Predictive modeling and business analytics are supposed to be cold and unbiased. Paradoxically, however, predictive modeling is laden with inherent biases. This is because the data that gets warehoused, and the data set subsequently chosen for analysis, are both products of human selection. That selection process is riddled with biases, often without anyone realizing it. A statistics professor has now developed a method that specifically tackles race-based bias. Criminal justice, for instance, is one area where the use of algorithms now demands extra precaution. Such algorithms must be treated primarily as sources of input from which we process information, not as final arbiters.
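The claim that human selection skews the analysis data set can be made concrete with a minimal sketch. The scenario below is entirely hypothetical (the group labels, rates, and 20% retention figure are illustrative assumptions, not from any real study): two groups are equally represented in the population, but records from one group are warehoused far less often, so any model trained on the resulting sample sees a distorted picture.

```python
import random

random.seed(0)

# Hypothetical population: two groups, A and B, equally represented,
# with the same true outcome rate (50% each).
population = ([("A", random.random() < 0.5) for _ in range(5000)] +
              [("B", random.random() < 0.5) for _ in range(5000)])

# Human data selection (assumed for illustration): only 20% of
# group B's records make it into the warehoused analysis set.
sample = [(g, hit) for g, hit in population
          if g == "A" or random.random() < 0.2]

def group_share(rows, group):
    """Fraction of rows belonging to the given group."""
    return sum(1 for g, _ in rows if g == group) / len(rows)

print(f"group B share in population: {group_share(population, 'B'):.2f}")
print(f"group B share in sample:     {group_share(sample, 'B'):.2f}")
```

Group B makes up half the population but only a small fraction of the sample, so a model fitted to the sample would learn mostly from group A. Nothing in the modeling step introduced this skew; it was baked in by the selection process before analysis began.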


Uploaded Date: 14 October 2019

SKYLINE Knowledge Centre

© 2017 SKYLINE. All rights reserved.