Chances are good that you’ve heard of predictive analytics: the extraction of information from existing pools of data as a means of identifying patterns and forecasting outcomes or trends.
Chances are also good that you’ve heard of predictive analytics being used to help law enforcement officials identify areas where they believe crimes are most likely to occur, so that a stronger police presence can be dispatched to these so-called trouble spots.
As interesting as this is, consider the efforts currently underway by the U.S. Army, which is now using the functional equivalent of predictive analytics to try to identify those soldiers most likely to commit a violent crime.
While it may sound like something out of a science fiction film, army researchers created a data pool consisting of 975,057 soldiers who served between 2004 and 2009, 5,771 of whom were convicted of violent felonies such as kidnapping, robbery, manslaughter, or murder.
Using a technique known as machine learning, they then searched the data for patterns and, once patterns were discovered, used them to create a so-called risk model identifying 24 factors common to the male soldiers with records of violent crime. These factors ranged from being young and raised in poverty to having disciplinary problems and recent demotions.
To test their risk model, the army researchers selected a random group of 43,248 soldiers who served between 2011 and 2013, and used the model to flag the 5 percent of the group it judged to be most at risk. They subsequently found that this small group was indeed behind a whopping 51 percent of the violent crimes committed among those 43,248 soldiers.
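For readers curious about the mechanics, the validation step described above can be sketched in a few lines of code. Everything below is hypothetical and illustrative: the factors, weights, and records are synthetic stand-ins invented for this example, not the Army's actual data, model, or results. The idea is simply to score each soldier on risk factors, flag the top 5 percent, and then ask what share of all crimes that flagged group accounted for.

```python
import random

random.seed(42)

# Hypothetical risk factors and weights -- invented for illustration only,
# not the 24 factors or weights from the Army's actual study.
FACTORS = ["young", "raised_in_poverty", "disciplinary_problems", "recent_demotion"]
WEIGHTS = {"young": 0.8, "raised_in_poverty": 0.6,
           "disciplinary_problems": 1.2, "recent_demotion": 0.9}

def make_soldier():
    """Generate one synthetic record: binary factor values plus an outcome
    that is loosely correlated with the risk factors."""
    fs = {f: random.random() < 0.3 for f in FACTORS}
    score = sum(WEIGHTS[f] for f in FACTORS if fs[f])
    committed_crime = random.random() < 0.01 * (1 + 3 * score)
    return fs, committed_crime

cohort = [make_soldier() for _ in range(40000)]

def risk(fs):
    """Risk score: weighted sum of the factors a soldier exhibits."""
    return sum(WEIGHTS[f] for f in FACTORS if fs[f])

# Rank the cohort by risk score and flag the top 5 percent.
ranked = sorted(cohort, key=lambda s: risk(s[0]), reverse=True)
top_5pct = ranked[: len(ranked) // 20]

# What share of all crimes did the flagged group account for?
total_crimes = sum(1 for _, c in cohort if c)
flagged_crimes = sum(1 for _, c in top_5pct if c)
share = flagged_crimes / total_crimes
print(f"Flagged 5% account for {share:.0%} of crimes in this synthetic cohort")
```

Because the synthetic outcome is deliberately correlated with the risk factors, the flagged group accounts for a disproportionate share of the crimes, which is the same disparity the researchers reported with their real data.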
While more testing needs to be performed and the uses of the risk model remain unclear, the army researchers indicate that it could be used to help identify at-risk soldiers and provide them with personalized instruction designed to supplement the violence-prevention training that all soldiers already receive.
As impressive as this technology is, it is also somewhat disconcerting. Indeed, it raises a host of unanswered questions concerning privacy rights and, more significantly, the potential for abuse among those in charge. Here’s hoping we see these questions answered before anything like this is rolled out in a civilian capacity.
What are your thoughts?