Automators against automation?
August 14, 2014 7:26 AM

I am looking for research on, or organizing around, the ethics and social fallout of automation from the standpoint of the workers doing the automating (i.e., computer programmers and allied trades).

I'm thinking of things like: But really I'm looking for any interesting work or thinking in this area, or even useful search terms to use. Most everything I read on automation is either (a) interested only in its effects on the workers whose jobs are being automated, with no real regard for how the automated systems are built or by whom, or (b) only interested in automation from a software engineering standpoint. I'm having a hard time finding work that considers the producers, consumers, and subjects of automation holistically.
posted by enn to Society & Culture (7 answers total) 6 users marked this as a favorite
 
Best answer: Check out the Association for Software Testing, where there is very lively debate on what "test automation" means for the software testing community and, beyond that, for software quality in general. (The trend in many software development groups is to automate as much testing as possible. This poses some potential problems.)

The talks from this past week will soon be archived here. James Bach's keynote, especially the questions that follow, should provide a lot of grist for the mill.
posted by Sheydem-tants at 8:25 AM on August 14, 2014


'Programming yourself out of a job' is a common phrase in the IT industry, while keeping systems creaky and cranky is known as 'job security'. It's an open secret in the trade that some IT professionals maintain parallel automated and manual versions of a system, along with a 'rip-cord': a dead man's switch that disables the automation should the worker be abruptly let go. The keys to the operation are usually handed off to a successor, provided the parting of ways is amicable. These are considered assets, and being given one is usually a sign of trust and mutual respect.
posted by sunslice at 8:31 AM on August 14, 2014 [1 favorite]


Following on from sunslice, I take the question here to be about the ethics and various other aspects of potentially 'programming someone else out of a job'. I'm afraid I can't think of any reading material on that subject, but it's something that concerns me as well.

My work isn't large-scale enough to affect a lot of people in this way, but when it affects people's jobs, I always make an effort (and am lucky enough to usually have the means) to generate more useful work for the same role at the same time. If I didn't have the option to do that, it would certainly make me question the ethics of my job, though I wouldn't find it clear-cut by any means.

Good on you for thinking about this and I'd be interested to see if we can come up with any related material.
posted by Drexen at 9:04 AM on August 14, 2014


I liked this essay series by Venkatesh Rao: Entrepreneurs are the new labor.

It's not directly about automation, but it is specifically about tech entrepreneurs who innovate new technologies, and about their instrumentalization at the hands of capital. Or, as he summarizes, "non-technical hustler founders of technology startups are the new labor." For example, you get passages like this:

"The residual traditional engineering types who eschew computers will increasingly find themselves in the position of 19th century artisans who failed to reinvent themselves as engineers: the knowledge they need will move into the new tools, and they will become the new blue collar class. Actually, it is going to get worse. They may not be needed at all. At that level, computers are the new labor."

It's interesting and thoughtful at the very least.
posted by suedehead at 11:42 AM on August 14, 2014


Best answer: But really I'm looking for any interesting work or thinking in this area, or even useful search terms to use.

It sounds like you're pretty focused on the micro; you don't mention how this is analyzed at a wider, macro scale. That may be because you're not interested in it, but it might help with search terms and with finding research.

I've seen some interesting comparisons between Australia and New Zealand - two countries that are culturally-similar and geographic neighbors, but which have taken different paths regarding automation.
New Zealand pursued heavier labour deregulation than Australia: hiring and firing became easier, the minimum wage was lower, and so on. The thinking was that this would make it easier to do business, and that more business would generate more wealth. But over the subsequent decades it was Australia that pulled ahead in wealth and standard of living, once appropriate adjustments were taken into consideration.

A theory with a lot of traction is that this is because of automation. Because employees were more expensive in Australia, it was cheaper, or less risky, to make one worker more productive than to hire two workers. Australian businesses thus had an incentive to invest in automation; each worker became a more powerful generator of wealth, the economy pulled ahead, and the national standard of living improved.

The USA by contrast seems to be dismantling the kind of social contract whereby sectors of the economy doing very well are a rising tide to lift all boats.
posted by anonymisc at 12:18 PM on August 14, 2014


Best answer: I'm a professional software engineer, and though I don't work on automation, I can think of some places you might want to start looking:

1) "Expert system" is a term from the field of artificial intelligence denoting a program that emulates the decision-making process of a human expert, e.g., in generating schedules. That may help when googling.
2) ACM and IEEE are the biggest US-based computing societies. If you're going to find a group of people organized to consider the ethics of automation, my bet is that it'd be in one of those two places (or more likely both).
3) Just as a piece of anecdata, I think the approach of most any software engineer to these types of problems is going to default to "pragmatic" rather than "ethical": we tend to look first at the problem as an intellectual (can it be done? how?) and practical (if it can be done, how well?) exercise, rather than first considering any moral or ethical implications. This is how Skynet happens; there aren't many John Connors out there.
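To give a flavor of the "expert system" idea from (1): classically, these were rule-based programs that encode a human expert's rules of thumb as data and apply them mechanically. Here's a minimal forward-chaining sketch; the rules and fact names are entirely hypothetical, just to illustrate the pattern:

```python
# Toy forward-chaining rule engine: fire rules until no new facts appear.
# All rule conditions and fact names below are made up for illustration.

RULES = [
    # (condition over known facts, fact to conclude when it fires)
    (lambda f: "night_shift" in f and "worked_yesterday" in f, "needs_rest_day"),
    (lambda f: "needs_rest_day" in f, "exclude_from_roster"),
]

def infer(facts):
    """Repeatedly apply RULES to the fact set until it stops growing."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in RULES:
            if condition(facts) and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(sorted(infer({"night_shift", "worked_yesterday"})))
```

The point is that the scheduler's judgment ("don't roster someone who needs a rest day") now lives in the program, not the person, which is exactly the dynamic the question is asking about.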
posted by axiom at 8:12 PM on August 14, 2014


Response by poster: Thank you all for the responses. I'm particularly looking forward to watching that software testing keynote; it seems like the tensions between automation as a tool to eliminate tedious tasks and automation as a threat to job security and control over one's own work would be particularly acute in the software testing field, so I expect the discussion will be very interesting.
posted by enn at 7:14 AM on August 15, 2014

