In today's Times Edward Lucas writes "Tech Giants must come clean with us - Too many decisions about our careers, love-lives and credit-worthiness are being made by secretive online algorithms."
He is right to point out that the use of computers, particularly by very large or powerful companies, to make decisions which affect our lives needs watching - but on the online comments page I have pointed out that decisions made by humans may not be any more reliable. I wrote:
But are humans any more reliable? We all have biases and make generalisations which have little or no foundation in reality.

To give an example: before I retired I worked in a university teaching Computer Science on a sandwich-course basis, which meant that the department regularly had to find about 90 placements for students for "on the job" training, mainly with local firms. Almost invariably the last 10 or 20 students to be placed included a disproportionate number who either had foreign-sounding surnames or were not white Anglo-Saxon in appearance - irrespective of how well they were doing on the course.

One of my personal first-year tutees, who had just failed an interview for a job working with a computer in the sales department of a small firm, asked me whether there might have been racial discrimination. What seems to have happened in the interview was that the computer manager realised that the student was not familiar with commercial English (for example the difference between "invoice" and "statement") and probably assumed, wrongly, that as he looked foreign he did not understand English. While the manager might of course have been directly discriminating on the grounds of race, it was far more likely that he had not realised how little the average 18-year-old knew of commercial jargon, and had jumped to an inappropriate conclusion.

A very different example, where I nearly acted on an inappropriate "racist" assumption. Fifty years ago we lived in a small town where the population was almost exclusively white. We went to a family wedding in London, taking with us our two-year-old daughter. We took it for granted that on one side of the aisle nearly everyone would be of European origin, and on the other side nearly everyone would be of Asian origin. As everyone was waiting for the bride (who was five minutes late) my daughter suddenly stood up in the pew, pointed towards the people on the other side of the aisle and shouted "Look Mummy, look."
I looked to see what she was pointing at and what had so excited her. All I could see was the crowd of Asian guests, and before I could grab her and put my hand over her mouth to stop her making a racist comment she shouted out "There's Mary with baby Jesus." What was new and exciting to her was that she had never been in a Roman Catholic church before!

While I am concerned about "black box" computers making decisions, those decisions either reflect the biases of the programmers who designed the system or are based on the statistical analysis of "Big Data" - and in the latter case may well be more reliable than a human's. As I see it, the problem is that the computer systems making the decisions are "black boxes" which cannot explain what they are doing in a way that those affected can understand. (In any case, if someone made a racist comment to you, would you ask them why they said it - and would you really expect an honest reply in every case?)
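The point about systems trained on past data inheriting human bias can be sketched in a few lines of Python. Everything here is hypothetical - invented data and the simplest possible "learner", one which just predicts the majority outcome it has seen for each group - but it shows why a statistically trained system can end up reproducing the biased decisions it was trained on:

```python
# A minimal sketch (hypothetical data) of how a model trained on biased
# historical decisions simply reproduces that bias. The "model" predicts
# the majority outcome seen in the training data for each group of applicant.

from collections import defaultdict

# Invented past hiring decisions, already tainted by human bias:
# (applicant group, hired?)
history = [
    ("local_name", True), ("local_name", True), ("local_name", False),
    ("foreign_name", False), ("foreign_name", False), ("foreign_name", True),
]

def train(records):
    counts = defaultdict(lambda: [0, 0])  # group -> [hired, rejected]
    for group, hired in records:
        counts[group][0 if hired else 1] += 1
    # Predict the majority outcome for each group
    return {g: hired >= rejected for g, (hired, rejected) in counts.items()}

model = train(history)
print(model)  # {'local_name': True, 'foreign_name': False}
```

No programmer wrote an explicitly discriminatory rule here; the bias arrives entirely through the training data, and the resulting model cannot explain its decisions in any terms a rejected applicant would find satisfactory.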