About This Blog
Automating Invention is Robert Plotkin's blog on the impact of computer-automated inventing on the future of invention and patent law.
- Artificial Invention
- Design & Engineering
- Evolutionary Computation
- Genie in the Machine
- History of Computing
- Human Creativity
- Intellectual Property Law
- Philosophy of Computing
- Software Patents
- Technology Industry
- AI and Future Enterprise
- The In Crowd
- The Search for Code-Free App Builders
- Upcoming Talk on Automated, Collaborative, and Distributed Inventing
- Yet Another Kind of Sourcing
- Inventing by and for the Masses
- Automation: Promoter or Destroyer of Creativity?
- Norbert Wiener on Computer Automation and Work
- IT Workers Increasingly Need "People" Skills
- Competing with People and Computers
- From computer programmers to "renaissance geeks"
- Automation and Innovation
June 11, 2009
AI and Future Enterprise
In Future Enterprise - the Intelligent Enterprise Revolution, futurist David Tow discusses the state of artificial intelligence in relation to e-commerce and global competitiveness. Tow lists the most promising AI trends as evolutionary or genetic algorithms, Bayesian networks, fuzzy logic, swarm intelligence, neural networks, and intelligent agents.
Tow cites four major trends which are now emerging in relation to AI and the intelligent enterprise revolution:
1. AI is being used more frequently in e-commerce to achieve higher quality decision outcomes.
2. AI is moving up the decision chain to strategic and senior management levels.
3. AI techniques are being linked and used in more powerful combinations.
4. AI is beginning to leverage web intelligence from social networks, search services, and semantic applications.
See his article for more details and for his predictions on future trends. There is also an interesting analysis of Tow's ideas on the Genetic Argonaut.
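Tow's list leads with evolutionary and genetic algorithms. As a rough illustration of the kind of technique he has in mind, here is a minimal genetic algorithm in Python that evolves a bitstring toward the all-ones optimum (the classic "OneMax" toy problem). The problem choice and all parameters below are my own illustrative assumptions, not anything from Tow's article:

```python
import random

random.seed(42)

GENOME_LEN = 20
POP_SIZE = 30
GENERATIONS = 60
MUTATION_RATE = 0.02

def fitness(genome):
    # OneMax: count the 1-bits; the optimum is a genome of all ones.
    return sum(genome)

def random_genome():
    return [random.randint(0, 1) for _ in range(GENOME_LEN)]

def select(population):
    # Tournament selection: pick the fitter of two random individuals.
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    # Single-point crossover combines two parent genomes.
    point = random.randint(1, GENOME_LEN - 1)
    return p1[:point] + p2[point:]

def mutate(genome):
    # Flip each bit with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

population = [random_genome() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print(fitness(best))
```

The same select/crossover/mutate loop, with a fitness function drawn from sales data or supply-chain costs instead of a bit count, is the skeleton of the e-commerce decision tools Tow describes.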
April 29, 2009
The In Crowd
In an article in the March 2009 issue of the Communications of the ACM titled "Crowd Control," Leah Hoffman looks at the growing popularity of computer crowdsourcing applications. Crowdsourcing leverages the abilities of the human mind which aren't easily replicated by computers, including visual cognition and language processing. Crowdsourcing applications distribute tasks related to these abilities, attracting workers by using online games or by paying a small fee for completion of a task.
Crowdsourcing first received widespread recognition with the publication in 2004 of James Surowiecki's best-selling book The Wisdom of Crowds. The oldest and most well-known crowdsourcing application is Amazon's Mechanical Turk, a web-based platform that allows 'workers' to complete micro-tasks and receive payment (which is usually also 'micro').
Consulting companies are now available to help businesses and individuals interface with Mechanical Turk. Dolores Labs, based in San Francisco, sets up Mechanical Turk tasks for clients, then validates the results for quality and meaning. For more details on how crowdsourcing works, see the ACM article.
Read an earlier post about crowdsourcing here.
January 25, 2009
The Search for Code-Free App Builders
InfoWorld reports on the search by many business managers for cheap, do-it-yourself development tools which will allow them to sidestep IT organizations and implement their own business applications. Tools such as Coghead, Caspio, Zoho and Wufoo are the latest incarnations of frameworks which promise code-free implementation of applications. It remains to be seen whether the Holy Grail of codeless development is within the reach of business managers.
January 9, 2009
Upcoming Talk on Automated, Collaborative, and Distributed Inventing
We usually think of an "inventor" as someone who sits alone in a workshop, sketching designs and hammering out prototypes. In the future, individual inventorship will increasingly be overtaken by various forms of "collaborative inventing" as inventors leverage computer technology as an inventive tool. The talk will provide real-world examples of the phenomena that are changing the face of inventing.
June 11, 2006
Yet Another Kind of Sourcing
Arnold Brown, in an article in the July-August 2006 issue of The Futurist, uses the term "othersourcing" to refer to "the increasing ability to have work done not only off-site and by other entities . . . but by nonhumans," particularly robots. He provides various examples of ways in which software, robots, and other machines are being used to perform tasks that were once thought to require creativity and therefore to be the exclusive province of humans.
June 6, 2006
Inventing by and for the Masses
We've all heard of "outsourcing," "insourcing," "offshoring," "competitive sourcing," and "near-shoring." Now Wired is reporting on "The Rise of Crowdsourcing" -- the use of average people and their networked computers to create content, solve problems, and invent. The article is well worth a read.
Most relevant to this blog is the example of InnoCentive which, according to its web site, "is an exciting web-based community matching top scientists to relevant R&D challenges facing leading companies from around the globe. We provide a powerful online forum enabling major companies to reward scientific innovation through financial incentives." In other words, if a company has a technical problem that its own engineers can't crack, the company can post the problem on Innocentive's web site and award a prize to anyone -- from anywhere in the world -- who can solve it. It's bounty hunting for inventions, at $10,000-$100,000 a pop.
The Wired article goes into some detail on a particular inventor, Ed Melcarek, who has solved several problems posted on InnoCentive from the comfort of his one-bedroom apartment in Barrie, Ontario. Why hire a team of high-priced engineers to solve a problem without any guarantee of success when you can farm out the work to garage inventors around the world and pay cash only on delivery?
In the context of open source software, Eric Raymond said, "with enough eyeballs, all bugs are shallow." Perhaps now we can say the same thing about inventions.
May 26, 2006
Automation: Promoter or Destroyer of Creativity?
Robert L. Glass asks in the Communications of the ACM (subscription required) whether certain advances in software engineering have helped or hindered the creativity of the programmers who have benefited from these advances. In particular, he acknowledges that those who were responsible for inventing these advances (such as high-order programming languages, operating systems, and modular programming) did something creative, but then questions whether these advances "enhanced the creative abilities of software engineers." His conclusion? "My answer, after some further reflection, remains 'yes.'"
We can ask the same thing about the impact of any kind of advance in automation or systematization. If someone were to automate big chunks of the process of writing a patent application (possibly putting me out of a job), it could take all of the creativity out of the process for someone who writes patent applications. Or it could remove the drudgery, allowing the writer to become creative at a whole new level. Which of these two possibilities occurs depends not only on the technology involved but also on the person using it.
January 12, 2006
Norbert Wiener on Computer Automation and Work
I've written several entries on the ways in which one economic effect of computer automation is to drive people to develop and market skills that have not yet been automated. Just to make clear that this topic is far from new, and that it has ethical as well as economic implications, consider the following statement about the human impact of computer automation from Norbert Wiener's 1950 book The Human Use of Human Beings: Cybernetics and Society:
We were here in the presence of another social potentiality of unheard-of importance for good and for evil.... It [technology] gives the human race a new and most effective collection of mechanical slaves to perform its labor.... However, any labor that accepts the conditions of competition with slave labor accepts the conditions of slave labor, and is essentially slave labor.... However, taking the second [industrial] revolution as accomplished, the average human being of mediocre attainments or less has nothing to sell that is worth anyone's money to buy. The answer, of course, is to have a society based on human values other than buying or selling.
Just some food for thought from a half-century ago.
January 11, 2006
IT Workers Increasingly Need "People" Skills
To be successful in the IT industry 20 years ago, all you needed were computer skills. Now, if you want job security in IT, you will need to "hone your business and project-management skills," according to this article from InformationWeek. If this sounds familiar to you, it may be because I've harped on this point several times before (such as here and here).
The challenge not only for IT workers, but for everyone, is to stay one step ahead of automation. Blacksmiths and millwrights learned this lesson the hard way a long time ago, but the "high-tech" workers of today (who are quickly becoming the "low-tech" workers of tomorrow) seem to just be getting the point.
December 29, 2005
Competing with People and Computers
In a ComputerWorld piece entitled, "What Tech Skills are Hot for 2006?," Thomas Hoffman says that the threat to U.S. IT workers from outsourcing is not as great as most people think. He quotes Craig Symons of Forrester Research Inc. as saying that "most of the stuff that's going offshore is low-level coding jobs." This leaves a demand for business analysts and IT relationship managers in the U.S., among others. In particular demand are people with experience in a specific industry.
This effect of outsourcing tech work to human workers in other countries is the same as the effect of automating tech work using machines. Once your skills can be performed more quickly, less expensively, or more effectively by someone else--whether the "someone" is a human or a machine--your only chance at success is to compete on the basis of higher-level (or otherwise specialized) skills that cannot yet be automated or outsourced at lower cost. This is just as true for businesses as it is for individuals.
One might try to maintain job security in such an environment in several ways. Lawyers do it in part by creating legal barriers to entering the field (e.g., rules prohibiting the unauthorized practice of law) and corresponding economic barriers (e.g., the high cost of obtaining a license to practice law). But the guild approach is rapidly breaking down for reasons I won't go into here.
One might try to predict which skills will elude automation and outsourcing for a long time and then sell products and services utilizing those skills. But predicting the future is a tricky business. Just ask a travel agent, encyclopedia publisher, or anyone in the music industry how successfully they predicted the impact of computer technology on their businesses over even the last five years.
Another strategy, which can of course be combined with the first two, is to develop the ability to adapt rapidly to changing conditions. It is important to keep in mind, however, that in this context the changing conditions include not only the spread of technological skill to people worldwide, but also the development and spread of technology that itself can perform tasks previously only capable of being performed by humans. If you are a technologist developing a technology that has the potential to displace you, you should at least consider keeping it to yourself. Then you can use it to boost not only your productivity but perhaps your job security as well.
August 23, 2005
From computer programmers to "renaissance geeks"
News.com comments on a piece in the New York Times arguing that computer programmers increasingly need skills in other fields, such as biology, business management, entertainment, and marketing. Although offshore outsourcing may be partly responsible, computer automation is also a driver of the demand for new skills. As "lower level" skills become automated, human programmers are finding it increasingly necessary to develop skills, and combinations of skills, that cannot currently be replicated by computers in order to maintain demand for their labor.
Automation and Innovation
Hillel Levin at PrawfsBlog comments on the decreasing extent to which computer users need to understand how computers work. He asks whether "it follow[s] that the pool of potential programming innovators is likely to dwindle, since there are going to be fewer school-age kids who have a basic understanding" of how computers work.
I think the answer is "no" if we understand a "programming innovator" to be someone who uses a computer to create innovative computer programs. As genetic programming and other techniques for automating the creation of computer programs improve, we will likely see an increase in computer innovations even as the need for old-fashioned programming skills -- such as the ability to hand-code the individual instructions that make up an algorithm -- decreases.
Remember that in the early days of computing, the term "assembler" referred to a person (usually a lowly graduate student) who hand-translated instructions in an assembly language into binary instructions in a computer machine language. Now that software "assemblers" automate the process of assembly, few humans know how to perform this function. Rather than reduce innovation, this increased ignorance of low-level technical details has spurred innovation by enabling computer programmers to focus their efforts on high-level problem solving rather than low-level implementation details.
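The point about rising abstraction can be made concrete. Genetic programming, mentioned above, evolves programs rather than hand-coding them. The sketch below is a toy, mutation-only variant in Python (the target function, operator set, and all parameters are illustrative assumptions of mine): it evolves a small arithmetic expression tree toward f(x) = x*x + x without anyone writing that expression by hand.

```python
import random

random.seed(7)

# Target behavior: f(x) = x*x + x, sampled at a few integer points.
CASES = [(x, x * x + x) for x in range(-5, 6)]

FUNCS = ['+', '-', '*']
TERMS = ['x'] + [str(c) for c in range(-2, 3)]

def random_tree(depth=3):
    # Grow a random expression tree as nested tuples, e.g. ('+', 'x', '1').
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return (random.choice(FUNCS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    # Recursively interpret the expression tree at the point x.
    if isinstance(tree, str):
        return x if tree == 'x' else int(tree)
    op, left, right = tree
    a, b = evaluate(left, x), evaluate(right, x)
    return a + b if op == '+' else a - b if op == '-' else a * b

def error(tree):
    # Sum of absolute errors over the test cases (lower is better).
    return sum(abs(evaluate(tree, x) - y) for x, y in CASES)

def mutate(tree):
    # Replace a randomly chosen subtree with a freshly grown one.
    if random.random() < 0.3 or isinstance(tree, str):
        return random_tree(2)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left), right)
    return (op, left, mutate(right))

population = [random_tree() for _ in range(100)]
for _ in range(40):
    population.sort(key=error)
    survivors = population[:20]   # truncation selection with implicit elitism
    population = survivors + [mutate(random.choice(survivors)) for _ in range(80)]

best = min(population, key=error)
print(best, error(best))
```

Note that the person running this loop never writes the evolved expression; they only specify what counts as a good answer. That is exactly the shift described above: effort moves from low-level implementation to high-level problem specification.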
July 28, 2005
Are college students abandoning computer science too quickly?
I can't fault any college student for seriously rethinking the value of a computer science degree in light of current, and likely future, economic conditions. But "computer science" isn't monolithic. Evolutionary computation, and the skills required to excel at it, differ in many ways from "traditional" computer science. And there are good signs that evolutionary computation is finding a foothold in private industry, as indicated by recent postings here. Perhaps those new college enrollees should seek to reposition themselves within computer science, rather than jump ship completely.
But minoring in business as a hedge probably wouldn't hurt either.
July 19, 2005
Pink: Conceptual age causes programmers to reconceptualize career options
I just saw this posting by Dan Pink citing to an AP story "on the dimming luster of programming jobs." According to the article, "many new entrants into the U.S. work force see info tech jobs as monotonous, uncreative and easily farmed out -- the equivalent of 1980s manufacturing jobs."
Automation tends to do that. I always thought it odd when programmers would claim (until recently) that computer programming was inherently more creative than other kinds of science and engineering. It should have been clear then, and the evidence is mounting now, that it was a peculiar combination of technological and economic circumstances that enabled the typical programmer to effectively demand such a large degree of on-the-job creative freedom as a condition of employment.
If programmers had any doubt that their jobs could be made less creative and even eliminated by the very technologies they brought into existence, they need only have asked their neighborhood blacksmith.
July 18, 2005
A Whole New Mind by Daniel Pink
A Whole New Mind by Daniel Pink is well worth reading if you're interested in the topics covered by this blog. Pink's argument is that holistic thinking, and a variety of skills associated with it, will become increasingly economically valuable in the coming "Conceptual Age." A relatively small portion of the book is dedicated to substantiating this claim. Most of the book focuses on describing the "six senses" -- the set of aptitudes that you will need to succeed in the Conceptual Age -- and on providing practical ways for individuals to sharpen those senses. (The six senses are Design, Story, Symphony, Empathy, Play, and Meaning.)
Pink identifies three drivers of the Conceptual Age: Abundance, Asia, and Automation. He draws a useful analogy between the defeat of the iconic John Henry by an automated steam drill and the defeat of chess grand master Garry Kasparov by the IBM supercomputer Deep Blue. Pink concludes the analogy:
Last century, machines proved they could replace human backs. This century, new technologies are proving they can replace human left brains.
I agree. But Pink's conclusion doesn't go far enough. Artificial creativity is proving increasingly able to replace human right brains. For example, human programmers were still required to program the incarnation of Deep Blue that defeated Kasparov. Kasparov may have his revenge when Deep Blue's programmers are put out of work by a genetic algorithm that evolves winning chess-playing strategies. Although we're not there yet, Moshe Sipper and his colleagues have made some great strides.
In the final analysis, my extended analogy is still consistent with Pink's general thesis -- that people will need to develop higher-level conceptual skills in the coming century to remain competitive. Deep Blue's programmers' best bet for keeping their jobs in the long term is to learn how to write genetic algorithms that produce chess-playing code, rather than continuing to fine tune their skills at writing chess-playing code itself.
June 29, 2005
Human-competitive software meets resistance from human competitors
There were quite a few presenters from private industry at GECCO this week. I had the chance to attend presentations by Erik Goodman (wearing his Red Cedar Technology hat), Thomas Baeck of NuTech Solutions, and Douglas J. Newell of Genalytics. Unfortunately, I could not attend the others but have been reading as many papers as I can.
One thing that struck me about some of these companies is that although they are able to use evolutionary computing to design products and solve other problems faster, cheaper, and better than ever before, they often face resistance from prospective clients. More specifically, they face resistance from the engineers, statisticians, and others who essentially fear losing their jobs to software.
In one sense, this is nothing new. New technology has always made it possible to displace human labor. What is worth noting is that technological automation is moving higher up the skill ladder, increasingly making it possible to automate functions -- such as those performed by engineers and computer programmers -- that were previously thought to require not only skill but also creativity and hence to be "safe" from the encroachment of automation. I believe this is one of the theses of A Whole New Mind by Daniel Pink, which I plan to read soon and comment on here.
June 23, 2005
Who "writes" a reality TV show?
WBUR reported this morning (the same story is being covered by Reuters and others) that the Writers Guild of America (WGA) has launched a campaign to gain a labor contract for writers of reality TV shows. Reality TV producers are objecting to such a contract, in part on the basis that the people seeking a contract aren't "writers."
What's the connection between this and automated inventing? Consider the following (from the Reuters story):
Instead of writing dialogue, reality TV writers say they help craft the overall sense of story. According to the union, this includes casting, creating scenarios, conducting field interviews and guiding the postproduction process so hundreds of hours of video end up with a meaningful beginning, middle and end.
For that reason, video editors feel they are equally deserving of WGA coverage.
"These stories come together in post (-production) -- stories are pulled out by us, in collaboration of course with storytellers -- but we're in there creating stories so it's a logical conclusion to be part of the Writers Guild," said editor Donna Egan, who also is helping organize this campaign. "A lot of it is just about having basic benefits -- health and pension. We have to change the system because the system isn't going to change voluntarily."
Is someone who works on a reality TV show a "writer" because he or she creates the environment in which a reality TV show plays out? This is similar to the question whether someone who writes automatic script-writing software is the "author" of the resulting scripts, or whether someone who writes automatic machine-designing software is the "inventor" of the resulting machines.
Whatever the answer to these questions, now at least I can justify watching "Fear Factor" as a way of conducting research into automated inventing.
June 13, 2005
What's in a name? Computer "science" vs. "engineering"
Illigal Blogging has a nice post criticizing the initial choice to use the word "science" (as in "computer science") to describe what computer people do. (I'll use the term "computer people" in this post to avoid any bias toward "science" or "engineering.") I agree with the author's assertion that the use of "science" instead of "engineering" is unfortunate because it fails to capture the significant ways in which computer scientists/engineers use computers to develop new solutions to real-world problems.
There has been similar debate about the use of the term "machinery" in the name of the computer profession's preeminent association: The Association for Computing Machinery. R.W. Hamming, in his acceptance speech for the 1968 Turing Award (ACM Digital Library subscription required), said:
At the heart of computer science lies a technological device, the computing machine. Without the machine almost all of what we do would become idle speculation, hardly different from that of the notorious Scholastics of the Middle Ages. The founders of the ACM clearly recognized that most of what we did, or were going to do, rested on this technological device, and they deliberately included the word “machinery” in the title [of the ACM]. There are those who would like to eliminate the word, in a sense to symbolically free the field from reality, but so far these efforts have failed. I do not regret the initial choice. I still believe that it is important for us to recognize that the computer, the information processing machine, is the foundation of our field.
The focus on the "scientific" and theoretical aspects of what computer people do has affected how the law has viewed computer software. For example, an argument that continues to be raised against the patenting of software to this day (and particularly strongly in Europe at the moment) is that software is "abstract" or "intangible" and therefore lacking in the "practical" or "technical" nature required for patent protection. I think that this dispute about whether software is "abstract" (and hence not patentable) or "technical" (and hence susceptible of patent protection) has some of the same roots as the debate about whether computer people are doing science, mathematics, engineering, some combination of them, or something completely different.
Although much of what computer people do is reasonably classified as "science," many are using computers to design new solutions to practical problems -- in other words, to engage in engineering. The debate over legal protection for software needs to incorporate a more nuanced understanding of what computer people do if there is to be any hope of a rational resolution to that debate.