
Lena, Fabio, and the Mess of Computer Science April 11, 2018

Posted by Peter Varhol in Publishing, Software development, Technology and Culture.

The book Brotopia opens with a description of Lena, the November 1972 Playboy centerfold whose photo by chance was used in early image-processing research at USC.  Over time, that single cropped image became a de facto technical standard for measuring the output of graphics algorithms.  Even today it appears in academic research to demonstrate the merits of alternative algorithms.

But today this image is also controversial.  Some complain that it serves to objectify women in computer science.  Others say it is simply a technical standard in the field.  One mathematics professor applied similar graphics algorithms to an image of Fabio in an attempt to bring some balance to the discussion.

In the 8th grade (around the time of Lena), my middle school (Hopewell Junior High School) assigned boys to Shop class and girls to Home Ec.  Perhaps one boy a year asked to take Home Ec, which boys could take only as a free elective, and doing so was viewed as an oddity.  During my time there, to my knowledge no girl asked to be in Shop class.

Of course, I thought nothing of it at the time, but today such segregation is troubling.  And even in 2015, a high school computer science class used Lena to show off its work with graphics algorithms, to mixed reviews.

There are many serious problems with the cult of the young white male in tech today.  As we continue to engage this demographic with not-so-subtle inducements to their libidos, we also enable them to see themselves as the Masters of the (Tech) Universe.  That worked out so well for the financial trading firms in the market failures of the 1980s and 2000s, didn’t it?

Does the same dynamic also make it more difficult for women to be taken seriously in tech?  I think it is part of the problem, but by no means the only part.  Women in tech are like people in any field – they want to do their jobs without cultural and frat-boy behaviors that make doing so that much more difficult.

I’ve been fortunate to know many smart and capable women throughout my life.  I had a girlfriend in college who was simply brilliant in mathematics and chemistry (in contrast, I was not brilliant at anything at that point in my life).  She may have been one of the inspirations that led me to continue plugging away at mathematics until I managed a limited amount of success at it.  Others try to do their best under circumstances that they shouldn’t have to put up with.

So let’s give everyone the same chance, without blatant and subtle behaviors that demean them and make them feel less than what they are.  We don’t today.  Case in point: Uber, which under Travis Kalanick was the best-known but by no means the only offender.  I hope we can improve, but I fear that we won’t.

About Computer Science and Responsibility March 31, 2018

Posted by Peter Varhol in Strategy, Technology and Culture.

Are we prepared to take on the responsibility of the consequences of our code?  That is clearly a loaded question.  Both individual programmers and their employers use all manner of code to gain a personal, financial, business, or wartime advantage.  I once had a programmer explain to me, “They tell me to build this menu, I build the menu.  They tell me to create these options, I create these options.  There is no thought involved.”

In one sense, yes.  By the time a project reaches the coder, there is usually little left in doubt.  But while we are not the masterminds, we are the enablers.

To be sure, not all programmers have viewed their work abstractly, without acknowledging potential consequences.  Back in the 1980s, I knew many programmers who declined to work for the burgeoning defense industry in Massachusetts of the day, convinced that their code might be responsible for war and violent death (despite the state’s cultural, well, ambivalence toward its defense industry to begin with).

Others are troubled by providing inaccurate information that is used to make decisions, or by manipulating people’s emotions so that they feel a particular way, or buy a particular product or service.  But that seems much less damaging or harmful than enabling the launch of a nuclear-tipped ballistic missile.

Or is it?  I am pretty sure that most who work for Facebook successfully do abstract their code from its results.  How else can you explain the company’s disregard for how people react to its extreme intrusion into the lives of its users?  I think that has relatively little to do with their value systems, and more to do with the culture in which they work.

To be fair, this is not about Facebook, although I could not resist the dig.  Rather, this is to point out that the implementers, yes, the enablers, tend to be divorced from the decisions and the consequences.  To be specific:  Us.

Is this a problem?  After all, those who are making the decisions are better qualified to do so, and are paid to do so, usually better than the programmers.  Shouldn’t they be the ones taking the responsibility?

Ah, but they can use the same argument in response.  They are not the ones actually creating these systems; they are not implementing the actual weapons of harm.

Here is the point.  With military systems, we are well aware that we are enabling war to be fought, the killing of people and the destruction of property.  We can rationalize by saying that we are creating defensive systems, but we have still made a conscious choice here.

With social systems, we seem to care much less that we are potentially causing harm than we do with war systems.  In fact, Mark Zuckerberg continues to insist that his creation is used only for good.  That is, of course, less and less believable as time marches on.

And to be clear, I am not a pacifist.  I served in the military in my youth.  I believe that the course of human history has largely been defined by war, and that war is the inevitable result of human needs: for security, for sustenance, or for something else.  It is likely that humanity in general will never grow out of the need to physically dominate others (case in point, Harvey Weinstein).

But as we continue to create software systems that manipulate people, and that make them do what they would not otherwise do, is this really ethically different from creating a military system?  We may be able to rationalize it on some level, but in fact we also have to acknowledge that we are doing harm to people.

So if you are a programmer, can you with this understanding and in good conscience say that you are a force for good in the world?
