Algorithms can be used to "assess the likelihood of re-offending and inform decisions about policing, probation and parole", the UK government said in a press release on Wednesday, adding that the CDEI aims to ensure such technologies are used ethically.
Programmes such as the Harm Assessment Risk Tool (HART) in Durham are already in place, helping police decide whether people on parole are likely to commit future offences.
The CDEI supports the government's Industrial Strategy and was set up to ensure that data-driven technologies and AI are "used for the benefit of society", including partnering with the Race Disparity Unit to probe potential racial bias in decision-making in the crime and justice system.
Digital secretary Jeremy Wright said at a Downing Street event: "Technology is a force for good which has improved people's lives but we must make sure it is developed in a safe and secure way."
The Centre for Data Ethics and Innovation was set up to help achieve that aim and "keep Britain at the forefront of technological development," Mr Wright added.
Roger Taylor, Chair of the Centre for Data Ethics and Innovation, said: "The Centre is focused on addressing the greatest challenges and opportunities posed by data driven technology. These are complex issues and we will need to take advantage of the expertise that exists across the UK and beyond. If we get this right, the UK can be the global leader in responsible innovation."
Other applications of such technologies include screening CVs and shortlisting candidates in recruitment to avoid racial or gender bias, as well as boosting innovation in the digital economy. Financial services can also benefit from AI and machine learning, for example in deciding who receives loans, but such technologies have drawn increased scrutiny over the transparency and fairness of their outcomes, the CDEI said.