The Defence and Security Accelerator (DASA), the UK agency tasked with driving innovation in Britain's defence industry, will host its Behavioural Analytics Phase 1 Showcase, aimed at building predictive technologies to counter terrorist attacks and spot alleged fake news, it announced on Monday.
The event, set to take place on 7 November at the Museum of the Great Western Railway (STEAM) in Swindon, will draw talent from groups that received £2.4m in funding from the Porton Down-based agency. The groups will compete for additional funds and the chance to have their innovations incorporated into future DASA technologies.
DASA delivery manager Rachael Colling called the event "a great opportunity" to learn how suppliers developed their solutions and how organisations could collaborate during the second phase of the government-backed competition.
Phase 1, launched in October last year, called for solutions providing "context-specific" insights into "individual, group and population behaviour", as well as learning how they were "likely to act in the future", and took a theory- and method-based approach to research. Phase 2 is set to launch at the end of the year.
A total of 29 projects were funded, and the competition was one of several research programmes exploring the UK defence and security industry's behavioural analytics capabilities.
RAND Europe, the European division of US think tank RAND Corporation, received the largest share at nearly £100,000.
The event's competition technical lead, Louisa Bryson, said: “The increasingly complex operational environment and the changing character of conflict will require an enhanced understanding of human behaviour to deliver strategic and operational effects.”
She added: “Using the DASA contractual processes, this competition enables UK defence and security to exploit the best ideas, products and services and fast-track them to operational use, whilst simultaneously developing external capability in the industry and academic supply chain.”
The latest event follows a growing trend in the British government of relying on artificial intelligence for security and administrative matters, notably after the UK Centre for Data Ethics and Innovation (CDEI) announced in March that it would launch similar predictive behavioural algorithm strategies to tackle crime and combat discriminatory employment decisions, among other issues.
Whilst the CDEI said it would use its technologies ethically, the agency aims to assess the likelihood of reoffending and advise the UK government on policing and parole, among other matters, under its 2-Year Industry Strategy, which may come back into the spotlight under UK Prime Minister Boris Johnson's policing initiative.
Prime Minister Boris Johnson revealed his review of the UK's sentencing policy in August, with further details set out in the Queen's Speech on 14 October, which aimed at tackling violent and sexual offences across Britain. The review pledged to stop the "cycle" of crime in the UK by blocking the early release of violent offenders serving partial sentences and extending their time spent in prison.
The UK government also pledged a further £250m to the National Health Service (NHS) to build an AI 'NHS X' complex to diagnose and detect Alzheimer's, cancer and other diseases in a bid to 'free up' staff to care for patients, Mr Johnson said in early August.