"The UK will also need the digital infrastructure to make AI a success. As of February 2018, only 3% of the UK is covered by full-fibre broadband"

The Lords Committee sees AI technologies as both a promise and a threat to the UK economy

Hina Pandya for Autonomy

This week saw the release of the report from the cross-party Select Committee on Artificial Intelligence (AI), appointed by the House of Lords on 29 June 2017. Their remit: “to consider the economic, ethical and social implications of advances in artificial intelligence”, drawing on a wealth of information, testimony and discussion contributed by academics and those in business.

The Select Committee concluded that the UK is uniquely placed to drive and shape the development and growth of AI, citing its firm legal system, its ever-growing start-up community, its record of technological development and, of course, its strong academic research culture as contributing assets.

The much-needed boost to productivity that AI could bring to the UK is articulated starkly by the Centre for Data Innovation, which says:

“Unless Britain can find a way to boost productivity, social and political crises will continue as incomes stagnate”

The committee is clear that the UK, whose productivity lags 35% behind Germany and 30% behind the US, could use the opportunity AI offers to improve on the economy's sluggish growth of 0.9% in 2017.

Interestingly, the inquiry found that few people are aware of AI and how it affects their lives, recalling the quote from AI pioneer John McCarthy: “as soon as it works no one calls it AI anymore.” Technology often becomes almost unnoticeable – insidious, perhaps – and is deeply integrated into our lives, with people using apps to monitor their sleep, to receive news tailored to their interests, or even to recreate the voice patterns of an individual and manipulate images. Even NatWest, a major UK bank, has begun to use “Cora”, an in-branch AI personality designed to assist with basic customer queries.

Many people are aware of machine learning – 76% of respondents to a survey run by the Royal Society confirmed as much – but the survey also showed that very few knew how it works, with fewer than 3% saying they knew a great deal about it.

It is therefore an obvious concern that this ever-changing landscape of AI technology will reshape the workforce in detrimental ways, leaving people without jobs or forcing them into different and less comfortable working arrangements. This concern is exacerbated by the fear that AI will result in “superintelligent machines”, a fear the committee argues is fuelled by Hollywood and news media depictions. These images in fact divert attention from the real and immediate risks and problems: unemployment and underemployment.

Problems with AI developments could leave many behind; in the past, technological change has particularly affected minorities, women, working mothers and people with disabilities. Inequality, too, may deepen across the country. It is estimated that smaller, less mobile towns and deprived areas of the UK, such as parts of the Midlands and the north of England, are likely to struggle the most as their jobs are replaced or displaced by AI. The report notes that, according to one witness (Contact Centre Systems Ltd.), AI used in UK call centres could handle 40% of all calls by 2020 and 70% by 2025. Just under 1 million people are employed in the industry, so on that basis roughly 400,000 and then 700,000 people would need retraining for new roles.
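As a rough back-of-envelope illustration of those retraining figures – assuming, as the article does, that the share of calls handled by AI maps directly onto the share of staff affected, a simplification rather than the committee's own method – the arithmetic works out as follows:

```python
# Back-of-envelope sketch of the call-centre retraining figures.
# The workforce size ("just under 1 million") and the witness's estimates
# (AI handling 40% of calls by 2020 and 70% by 2025) come from the report;
# mapping call share directly onto staff affected is an illustrative
# simplification, not the committee's own methodology.

call_centre_workforce = 1_000_000           # "just under 1 million" employees
ai_call_share = {2020: 0.40, 2025: 0.70}    # share of calls handled by AI

for year, share in ai_call_share.items():
    staff_affected = round(call_centre_workforce * share)
    print(f"{year}: ~{staff_affected:,} workers may need retraining")
```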

One possible response to this changing landscape of work is Universal Basic Income (UBI): a regular payment, replacing some or all benefits depending on the model, made to every person whether in work or not, and currently being trialled in parts of Scotland and in many other countries. Scottish First Minister Nicola Sturgeon has stated that it is ‘important to be open-minded about ways that we can support individuals to participate fully in the new economy’. Critics say UBI takes away the purpose and meaning that work provides, which could not be replaced by a ‘simple cash payment’. Designing any potential scheme may take up to 18 months, and it will be considerably longer before definitive results can be gathered for the UK.

From this the Committee concludes that ‘everyone must have access to the opportunities provided by AI’, while acknowledging the inequalities that might result and determining that these should be ‘explicitly addressed in Industrial Strategy’. The Committee remains unconvinced of the merits of adopting a UBI policy, however.

Retraining is the way forward, say the Select Committee, who believe there will be widespread disruption to jobs, both blue- and white-collar, over the coming decades. They recommend that a National Training Scheme be developed and financed by government, with funding matched by industry.

The cross-party select committee is under no illusion, though, about the significant political commitment this will require if it is to succeed. Retraining may not help everyone, of course: contributor Professor Susskind stated that truck drivers made redundant by AI driving technologies would be unlikely to have the educational background needed to retrain for new work.

Encouragingly, the Committee is keen to uphold ethics and not repeat the mistakes of the past – mistakes such as DeepMind's project, which breached the Data Protection Act in its handling of people's medical data, or ‘pleaserobme.com’, which combined two data sets to reveal when houses were empty (and ripe for burglary).

The Committee is acutely aware that, for AI to be effective, it must capture a great deal of data on a great number of people. It therefore recommends that developers of any new AI applications be made aware of the ethical implications of the data they use, and of the risk that their work could be used improperly, possibly for malevolent purposes. Autonomous weapons are included under the rubric of these ethics.

The Committee consulted various bodies and researchers to collate a plan for regulation. ‘Blanket’ regulation of AI is not recommended at this stage. Because AI is used in so many sectors, the report makes clear that oversight of its impact should continue to fall to sector-specific regulators, but states that these regulators must be given the resources and powers to enforce their rules effectively.

Most importantly, the UK will also need the digital infrastructure to make AI a success. As of February 2018, only 3% of the UK is covered by full-fibre broadband; the report is concerned that there is not enough ‘impetus’ for the country to take full advantage of the potential that AI offers, and it urges the Government to invest in 5G and ultrafast broadband.

The report includes historical evidence that technology has brought many advances over the ages, in manufacturing, clothing and other industries, and that similar fears about disappearing jobs existed then. Instead, new jobs simply appeared – and, as the report notes, we may not yet know what the new roles created by AI will be.

Although many who gave evidence and contributions to the Select Committee said they felt action needed to be taken on the possibility of job losses, they also agreed that these predictions were ‘evidence-light’.

The report clearly urges the UK to be a leader in the ethics of AI, addressing all the issues that come with the technology and learning from the mistakes and false starts of the past. It may well be that AI presents no problem for the job market, but the report is a good summary of what we know we don't know.

AI is recognised as one of the defining phenomena of our times and will no doubt shape our lives for years to come. The report shows that the members of the Lords Select Committee on AI are keen for the UK to keep up. Yet they are all too aware that while establishing strong ethical guidelines deals with one part of the problem, there is no certainty at all that the current UK government is fully committed to making the UK technologically capable. Only time will tell.

"Retraining is the way forward say the Select Committee"

Hina Pandya has been a practising freelance journalist for 10 years. Previously she worked in policy for the Department for International Development and the Independent Police Complaints Commission. Many years ago, with more creative flair, she worked at Industrial Light and Magic in California, making models for film – an escape from the real-world issues she writes about today.