Technological addiction was a problem even in the early days of computer programming, according to Dave Smith, while Peter McKenna says search engine algorithms alone are not to blame for gender bias
I found Moya Sarner’s article on digital addiction and her story of Lady Geek’s reverse ferret from digital guru to prophet of doom absorbing, timely and somehow familiar (Is it time to fight the digital dictators?, 15 March). She also quotes Professor Mark Griffiths, director of the International Gaming Research Unit at Nottingham Trent University, as having coined the term “technological addiction” in 1995. In 1971 I started a degree in maths, electronics and physics at Chelsea College, University of London, which involved a certain amount of programming on the college’s Elliott 803 mainframe.
I clearly remember our lecturer sternly warning us about the dangers of getting over-involved in programming, quoting the case of an earlier student who had spent so many nights in the computer room, addicted to getting his programs just so, that he neglected all his other studies and eventually failed to make progress in anything. Remember that this was back in the days when our programs were written in Fortran on decks of hand-punched 80-column cards.
Throughout my career as an IT consultant (and teacher of system developers and designers) I have been very wary of the addictive qualities associated with the immediate feedback and dopamine stimulation of all digital devices. As a result, I don’t have a Facebook or Twitter account, my smartphone is mostly just used as a phone, and my tablet is mainly used to read the Guardian (and if that’s an addiction, I’m happy with that).
Ivana Bartoletti’s case (Women must act now, or male-designed robots will take over our lives, 14 March) would be much stronger if she provided some concrete indications of bias in actual search engine algorithms, and proposed specific algorithmic modifications to “correct” bias confirmation or embody “intersectional thinking”. Instead, it seems much more likely that any bias lies in the source materials and in those who produce and use them.
There is an old adage in computing circles that goes by the acronym GIGO: “garbage in, garbage out”. In 1864 Charles Babbage wearily recalled that he had twice been asked: “Pray, Mr Babbage, if you put into the machine wrong figures, will the right answers come out?” That the vast majority of the biggest companies (with the most optimised websites and the deepest information reach) have male CEOs is not the fault of search engines, which produce photos of men whenever “CEO” is entered in an image search.
Microsoft’s Tay chatbot was not unlike the 1960s Eliza in that it basically repeated and rearranged verbal input for its responses; in Tay’s case, Twitter users deliberately set out to subvert it. And the “unprofessional hair” search results included many articles critical of bigoted attitudes towards black hairstyles; most of the top hits now concern the story about the original search results.
We cannot blame tools for unpleasant characteristics of society. Indeed, ascribing responsibility so superficially shields us from realising the need for deeper societal change.