About This Blog
Automating Invention is Robert Plotkin's blog on the impact of computer-automated inventing on the future of invention and patent law.
- Artificial Invention
- Design & Engineering
- Evolutionary Computation
- Genie in the Machine
- History of Computing
- Human Creativity
- Intellectual Property Law
- Philosophy of Computing
- Software Patents
- Technology Industry
December 1, 2009
Biological Models for Robot Design
MIT professor Sangbae Kim is studying the animal kingdom to find biological models for robot designs. Working with Stanford professor Mark Cutkosky, Kim has designed robots like the Stickybot, named by Time Magazine as one of the best inventions of 2006. Inspired by the gecko, the Stickybot uses directional adhesive foot pads to climb smooth vertical surfaces such as glass. Another robot, named iSprawl, uses motion inspired by cockroaches. Kim's latest project uses the running style of the cheetah as the model for its locomotion. Kim and a team of four MIT graduate students are creating a prototype robot using a lightweight carbon-fiber-foam composite. In contrast to most robots, which use wheels for motion and are relatively slow, with top speeds of about 5 mph, the cheetah-inspired robot will be designed to run at 35 mph. According to Kim, the challenges of the project include replicating the structure of the cheetah and designing a motor that can quickly provide enough power for the robot to sprint.
November 12, 2009
A Theory of Evolution for Technology
In New Scientist, complexity theorist W. Brian Arthur proposes a theory of evolution for technology. The theory was first called for by Victorian novelist Samuel Butler just four years after Darwin wrote On the Origin of Species. Arthur believes that a new theory must be developed, rather than borrowing from Darwin's theories about biology. He suggests that a mechanism that he calls combinatorial evolution is what causes new technologies to arise from combinations of existing technologies. Read more about Arthur's theory on the New Scientist website.
October 31, 2009
Programmable DNA Computers
A team of researchers at the Weizmann Institute of Science in Israel has been working with biomolecular computers made of DNA and other biological molecules. Working under Professor Ehud Shapiro, the team introduced the first programmable DNA computing device in 2001. The device is amazingly small - a trillion can fit in a drop of water. In a recent paper published in Nature Nanotechnology, the team describes the design of an advanced logical program for the device, as well as a compiler. Read more about the DNA computers at this Weizmann Institute website.
October 25, 2009
Automating Software Design
A major European research project has developed a system that helps software developers take new software from the drawing board to executable code. The VIDE project (for VIsualize all moDel drivEn programming) began with the goal of making the development of executable software a single process instead of a sequence of separate activities. Funded by the European Union, the project has been underway for two years and has produced a design and development toolkit that is reported to simplify, speed up, and reduce the cost of creating high-quality, easily modifiable software. The system is based on a pre-existing modeling language called UML, for Unified Modeling Language.
August 27, 2009
Power was the Focus at the Design Automation Conference
At the 46th Design Automation Conference, held the final weekend in July in San Francisco, low-power design was one of the central themes of workshops, tutorials, presentations, meetings, and technical tracks. Power is currently one of the biggest challenges for designers, especially when balanced against the demand for more power in portable devices. The need for design techniques that lower both static and dynamic power consumption continues to be one of the most popular topics at DAC. Attendees voted for "Power Scavenging: Waste Not, Want Not" as their favorite topic of the conference.
August 23, 2009
Extending Moore's Law
Moore's Law holds that the number of transistors that can be placed on an integrated circuit doubles every two years. The law, named for Intel co-founder Gordon Moore, is believed by many to be approaching its limit. Now a Rice University laboratory is manipulating molecules in a way that could extend the law's life for a few more years. A research team led by Rice University professor James Tour has published a paper outlining a way for chip designers and manufacturers to continue miniaturizing transistors, circuits, and microprocessors.
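The doubling rule described above can be written as N(t) = N0 · 2^(t/2), where t is measured in years. A quick sketch (the transistor counts here are illustrative round numbers, not figures from Intel or Rice):

```python
def transistors(n0, years, doubling_period=2.0):
    """Project a transistor count under Moore's Law: the count
    doubles once every `doubling_period` years."""
    return n0 * 2 ** (years / doubling_period)

# Example: a chip with 1 billion transistors today, projected
# 10 years out -- five doublings, so a 32x increase.
print(transistors(1e9, 10))  # 32000000000.0
```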
August 20, 2009
A Shift in Microsoft's Focus
In an InfoWorld article, Microsoft's chief research and strategy officer, Craig Mundie, predicts that computers of the future will be programmed to serve people automatically, rather than reacting to human instructions. "I've lately taken to talking about computing more as going from a world where today they work at our command to where they work on our behalf," Mundie is quoted as saying. This focus indicates a shift in Microsoft's strategy towards user interfaces for computers (including gesture, voice and touch interaction).
At Microsoft's annual Field Summit in Spokane, Washington, Mundie showed a video of a prototype digital personal assistant. The image of a person on a computer screen was used to gather information and perform tasks. Mundie also presented a video on the digital office of the future, where files, a whiteboard, and a presentation area are all represented by 3D projections. Interaction would take place through touch and gesture. Mundie called these demos "half smoke and mirrors and part real," but stated that they are possible with current technology.
August 16, 2009
Blurring the Line Between Hardware and Software
Writing on the Foresight Institute website, J. Storrs Hall discusses how the boundary between hardware and software is becoming "fuzzier" as systems become more complex and nanotechnology becomes more important. With future use of nanocontrollers, the complexity of mechanical systems will accelerate to the point that "matter compilers" will be required for the design. This means that the nanotechnology designer will be using the same processes to design nanotechnology that today's software developers use to design and implement software. Dr. Hall predicts that the ability to write reliable software will become more and more important in the coming world of nanotechnology. If he is right, this is further evidence that the problems that software has caused for patent law will begin to creep into the application of patent law to nanotechnology for the same reasons.
Dr. Hall is a leader in the field of molecular nanotechnology and president of the Foresight Institute. He is also known for coining the term Utility Fog, which is a hypothetical collection of nano-robots that unite to form a solid mass in the shape of any desired object.
July 27, 2009
Brain-Computer Interface Technology
Researchers at the University of Washington are researching Brain-Computer Interface technology with the goal of connecting people with computing devices through brainwaves. It may sound like science fiction, but researchers have already demonstrated that brainwaves can be decoded and fed to computers which are programmed to understand them. In 2002, a neurologist named Eric Leuthardt and a biomedical engineer named Daniel Moran developed video games that can be played by a human imagining an action. Now Leuthardt and other researchers are working on ways to decode speech imagined in the mind, with the goal of changing the way we interact with computers and other devices, including prosthetic limbs.
July 23, 2009
DIY Robots at Maker Faire
At the annual Maker Faire this spring in San Mateo, California, do-it-yourself robotics was one of the largest exhibit categories. Twenty-four individuals and groups had working robots on display. Two of the popular events at the Faire were the RoboGames, where robots were pitted against each other in combat, and RoboMagellan, where robots used GPS to navigate obstacle courses. The function of many of the robots tended toward the whimsical, with robots that make cocktails on demand and replicas of Star Wars robots being two of the most popular genres. The proliferation of DIY robotic kits is credited with the boom in homemade robots at the Faire and could be the impetus for a mainstream movement.
June 17, 2009
The ICARO Aeronautical Design Project
The ICARO project has been undertaken by the Basque corporation Tecnalia and the Spanish corporation Aernnova to develop Multidisciplinary Optimization (MDO) software to optimize the design of aircraft wings. The goal of the project is to reduce manufacturing costs and weight and to develop innovations in composite aeronautical structures for future aeronautical programs.
The design process for aircraft wings is complex, requiring diverse factors such as mechanical loads, vibrations, fluid dynamics, and temperature to be balanced against viability and cost. MDO software aids the process by repeatedly calculating factors such as heat flow temperatures, finding valid designs, and then factoring in variables such as weight and cost.
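The filter-then-rank loop described above can be sketched in a few lines. This is illustrative only: the candidate wing data, the load requirement, and the scoring weights are all invented for the example and have nothing to do with the actual ICARO software.

```python
# Hypothetical candidate wing designs with precomputed properties.
candidates = [
    {"name": "wing_A", "max_load": 120, "weight": 900, "cost": 1.2e6},
    {"name": "wing_B", "max_load": 95,  "weight": 780, "cost": 1.0e6},
    {"name": "wing_C", "max_load": 130, "weight": 990, "cost": 1.4e6},
]

REQUIRED_LOAD = 100  # assumed structural requirement (illustrative)

# Step 1: keep only designs that satisfy the physical constraint.
valid = [c for c in candidates if c["max_load"] >= REQUIRED_LOAD]

# Step 2: rank the valid designs by a weighted mix of weight and cost.
def score(c, w_weight=1.0, w_cost=1e-3):
    """Lower is better: trade off weight (kg) against cost (euros)."""
    return w_weight * c["weight"] + w_cost * c["cost"]

best = min(valid, key=score)
print(best["name"])  # wing_A (B fails the load constraint; A beats C)
```

Real MDO tools iterate this loop many times, recomputing loads, vibrations, and fluid dynamics for each candidate rather than reading them from a table.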
June 15, 2009
More Supercomputer Time for Simulation Research
Wired Science reports that starting in 2010, the Department of Energy will make available to the science community at large additional processor time on two of its supercomputers. A total of 1.3 billion processor hours will be set aside on Oak Ridge National Laboratory's Cray XT "Jaguar" and Argonne National Laboratory's IBM Blue Gene/P "Intrepid." These two supercomputers are among the most powerful in the world and are dedicated to open scientific research. The additional research hours will allow astrophysicists, biologists, and other scientists to perform complex simulations involving multiple simultaneous physical phenomena.
June 9, 2009
Crowdsourcing with The Energy Crowd
The Energy Crowd is a crowdsourcing web site which is attempting to create new sustainable energy technologies by gathering input from engineers, designers and the general public. The goal is to bring together individuals who might otherwise be working in isolation. Participants may be highly skilled but are working without remuneration. A project which is currently underway is the design of a solar heating system for the home which will capture energy from sunlight during the summer and store it to heat the home during the fall and winter. If a commercially-viable solution is arrived upon, it will be made available via a General Public License (GPL).
The site also publishes online a collection of news and information about sustainable energy called 101 Ways to Kick the Carbon Habit.
May 25, 2009
Twitter on the Brain
Twitter has become habit-forming for many people, but only Adam Wilson can claim to have posted on Twitter using his brain waves. His message, "using EEG to send tweet," was actually a demonstration of brain-computer interface technology. Wilson, a University of Wisconsin biomedical engineering doctoral student, is interested in providing communication interfaces for people with "locked-in syndrome," meaning their brains are functional but communication is hampered by injury or a medical condition.
Wilson used an electrode-studded cap wired to a computer to send electrical signals from his brain to an on-screen keyboard. Focusing on a letter caused it to be 'typed' on the keyboard. The Twitter experiment was one of the first to tie a brain-computer interface to Internet technology. Justin Williams, a UW professor and Wilson's advisor, hopes that this demonstration will inspire other researchers to focus on inventions that will help the daily life of people with neurological deficits.
May 17, 2009
Simulating Auto Assembly
Problems related to robotic car assembly include determining if new components will fit into the car assembly and then finding the most efficient way for the robot to install the component. New CAD software designed by European scientists performs virtual installation of car components, detecting when problems will occur and offering advice on how the component should be redesigned. The software is also capable of simulating flexibility in components, factoring the amount of bend in the component into assembly planning. The interactive program allows changes to components to be applied and previewed in a matter of seconds.
The software was developed and refined at the Fraunhofer-Chalmers Research Centre for Industrial Mathematics FCC in Gothenburg, Sweden, and the Fraunhofer Institute for Industrial Mathematics ITWM in Kaiserslautern, Germany. There is a plan to make this software commercially available before the end of the year.
May 11, 2009
Web Cubed: Everything will be Connected
According to futurists, the next generation Internet will be built on networks linking not only computers and telecommunications devices, but many other products including clothing, cars, and all types of personal electronic devices. A new paradigm to support what is known as "cubed networking" will be needed to move beyond Web 2.0, as well as new technology to support users by dynamically adapting to each user's environment.
New technology always requires new standards, and there are currently no standards bodies addressing networks with the potential to support billions of dynamic heterogeneous devices. As a start, several European companies have embarked on a collaborative effort called BIONETS (BIOlogically inspired NETwork and Services). The goal of the BIONETS project is to lay down the foundation for the design of the complex large scale networks which will be required by the Internet of the future.
According to BIONETS project coordinator Daniele Miorandi, "The first problem is scale. A network capable of linking everything together will be huge, and it will take some serious engineering to create a framework and platform capable of attaining this sort of scope." The philosophy behind BIONETS is to use models of complex heterogeneous systems found in nature as a source of inspiration.
March 25, 2009
Open-Source Software Speeds up Molecular Simulations
The simulation of molecular motion is a critical element in the study of diseases such as Alzheimer's and Parkinson's and in the development of vaccines. In the past, this simulation required large amounts of computing power, restricting researchers to using supercomputers or to breaking up the computations for processing on less powerful desktop computers. Now, thanks to an open-source software package developed at Stanford, molecular motion simulation can be done on desktop computers at higher speeds than ever thought possible.
"Simulations that used to take three years can now be completed in a few days", said Vijay Pande, an associate professor of chemistry at Stanford and principal investigator of the Open Molecular Mechanics (OpenMM) project. OpenMM is able to perform accelerated simulations by taking advantage of inexpensive graphics processors (GPUs). An added bonus is that the software works with any brand of GPU, meaning that the simulations can be done on most laptop and desktop computers.
March 1, 2009
A team of engineers at Carnegie Mellon University has developed software that supports creation of 3D designs by sketching on a tablet computer. The SketchCAD software is intended to give users more creative freedom and a shorter learning curve. Future users of SketchCAD may include physicians planning surgeries as well as mechanical engineers designing products such as cars.
Some industry experts have pointed out that tablet input devices for CAD were first introduced in the 1960s, but were eventually replaced by the ubiquitous mouse. According to Levent Burak Kara, one of the project leaders, this is a new take on an old tool. "The idea is to empower engineers and designers with tools that are already familiar to them and are the most natural for the task."
February 13, 2009
Adrian Thompson is a scientist on the faculty at the University of Sussex who specializes in "Evolutionary Electronics," the use of genetic algorithms in the design of electronic systems. Thompson calls this type of design evolutionary because it resembles natural selection, with "selection acting repeatedly upon heritable variation." He has researched the idea of self-designing circuits which could be used to build neural network chips.
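Thompson's work evolves real hardware, but the core idea of "selection acting repeatedly upon heritable variation" can be sketched with a minimal genetic algorithm. The bitstring genome and the toy fitness function below are invented for illustration and are not Thompson's actual system:

```python
import random

def evolve(fitness, genome_len=16, pop_size=30, generations=100, seed=0):
    """Minimal genetic algorithm: keep the fittest half of the
    population each generation, and refill it with mutated copies."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)        # selection
        survivors = pop[:pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]                      # heredity
            i = rng.randrange(genome_len)
            child[i] ^= 1                          # variation (mutation)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Toy fitness: the count of 1-bits, standing in for a measured
# circuit score in real evolutionary electronics.
best = evolve(fitness=sum)
print(sum(best))
```

In evolutionary electronics the fitness function would instead measure how well a candidate circuit configuration performs its intended task.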
February 9, 2009
Open Innovation and the Raymond Conference
An article in BusinessWeek describes how the concept of open innovation in business was put into practice at the fifth annual Raymond Conference in Rotterdam, held in February 2008. The conference was attended by 17 design managers from companies such as Heineken, Lego, Airbus, and Hewlett-Packard. Conference attendees were asked to collaborate as if they were all part of a single global design company. Their mission was to deliver the best, fastest and most inexpensive design solutions for a broad range of businesses. Designers experienced the chance to look at their businesses from another industry's perspective and learned new ways to share knowledge among designers.
January 13, 2009
Molecular RNA Computers
Two scientists at the California Institute of Technology have developed molecular computers which self-assemble from RNA within living cells. The RNA computers currently perform low-level logic operations, but may someday be programmable and able to react to specific conditions within a cell. Christina Smolke, who carried out the research with Maung Nyan Win, suggests that smart drug delivery systems could be designed which would target cancer cells. The full results of the research were published in Science.
January 9, 2009
Upcoming Talk on Automated, Collaborative, and Distributed Inventing
We usually think of an "inventor" as someone who sits alone in a workshop, sketching designs and hammering out prototypes. In the future, individual inventorship will increasingly be overtaken by various forms of "collaborative inventing" as inventors leverage computer technology as an inventive tool. The talk will provide real-world examples of the phenomena that are changing the face of inventing.
Putting Quantum Devices to the Test
A team of researchers at the University of Calgary has developed a new methodology for testing quantum devices. Their breakthrough is described in the journal Science and its online publication Science Express. Quantum computers, considered to be the next step in the evolution of computer technology, have been stalled in part by the lack of test processes for components in a quantum system. The new test methodology, which uses standard laser and lens techniques, will allow the behavior of optical quantum processes to be quantified. Supercomputers and communication networks based on quantum physics are now one step closer.
December 27, 2008
Programming Molecular Computers
The field of "molecular computing" seeks to make computers constructed from individual molecules. Kirk L. Kroeker reports on a variety of recent advances in molecular computing in the most recent issue of the Communications of the ACM.
Of particular interest to readers of this blog is that Caltech Professor Niles Pierce and his colleagues "are working on the algorithms needed to create what he calls 'a compiler for molecular computing' that will take as input a high-level abstraction of the desired function for a molecular system and produce as output molecular sequences that can be synthesized to execute the function in a test tube or cell." This transformation of high-level abstraction into a concrete description of a physical structure that embodies the abstraction is what I refer to as "the fundamental structure of a wish" in The Genie in the Machine. The description of the high-level abstraction is like a wish and the computer that transforms the abstraction into a molecular sequence is like a genie that grants the wish. The history of computer science demonstrates the power of this paradigm, which is why every time a new kind of physical computing device emerges, we see efforts to create artificial genies for use with the new computing device.
December 25, 2008
Print Your Own Products
Professor Thomas A. Easton writes in the latest issue of The Futurist about how improvements in three-dimensional printing technology and new business models (such as that of Ponoko, previously reported on here) are enabling products to be manufactured on-demand from digital design files. He imagines a not-too-distant future in which consumers will have 3D printers, or "fabbers," right on their home desktops, ready to manufacture products purchased online in an instant.
For more on this and other related topics, check out Prof. Easton's blog, Technoprobing.
November 19, 2008
Outsourcing Manufacturing Isn't Just for Large Companies Anymore
Are you an individual designer or inventor who wants to earn a living from selling your products but who doesn't have the time, inclination, or money to sell your products yourself, and who wants to be your own boss? No worries. Wired magazine reports that a company named Ponoko will let you upload your designs to them in digital form. They will then market your products for you. When a customer purchases your latest chair, Ponoko will use its laser cutters to cut your chair from a block of wood and/or plastic, based on your digital design, and then ship the resulting product to the customer's door. Ponoko sends you a cut of the sale price. The result is that you can focus on being creative and leave the messy details of marketing, manufacturing, and distribution to someone else. Ponoko is just one example the article provides of companies that are spurring the "rise of the instapreneur."
November 16, 2008
Gamers Solve Problems in Science and Computing
New Scientist reports on an expanding breed of online games that use human problem-solving skills to make progress on cutting-edge problems in science and computing. For example, the puzzles at www.fold.it require players to manipulate 3D structures to fit into the smallest possible space. The web site uses the solutions provided by users to help scientists learn about how proteins fold in the real world.
This and the other projects described in the article show how clever uses of distributed computing are increasingly combining software with human problem-solvers, leveraging the strengths of each to achieve results that neither computers nor humans could have obtained individually.
October 7, 2008
Breaking the Software Development Speed Limit with Agile Programming
Science Daily reports that so-called "agile software development" can be used to slash software development time, based on the results of 68 pilot case studies of the approach. A key feature of agile programming is the rapid development and testing of prototypes, in contrast to the traditional "waterfall model" in which the entire program is developed and implemented before being tested.
September 5, 2008
"Little b," a "new computer language for modeling biological phenomen[a,] can 'think' like cells and molecular mechanisms think, thereby simulating the dynamics of biological phenomen[a]." A team of Harvard Medical School researchers created Little b to be capable of "describ[ing] biology in the same way a biologist would." According to Jeremy Gunawardena, director of the Virtual Cell Program at Harvard Medical School, "This opens the door to actually performing discovery science, to look at things like drug interactions, right on the computer."
August 24, 2008
Robot Design for the Masses
If you've ever dreamed of designing your own robot but you don't know anything about electronics, at least two options are available for you today:
- Qwerk, which Professor Illah Nourbakhsh of Carnegie Mellon University says will "democratize robot design for people intimidated by current techniques and parts"; and
- iRobot's Robot Development Kit -- the name says it all.
August 15, 2008
Two Heads (One Silicon, One Carbon) Are Better Than One
Using computers to automate inventing does not mean that humans become irrelevant. To the contrary, the most effective kinds of invention automation often involve cooperation between human and computer, a partnership in which each member does what it does best. Interactive evolutionary computation strives to take advantage of such synergies.
Luis von Ahn has developed a specialty in creating computer games which double as human-computer teams for solving problems that neither could solve by itself, such as:
- The ESP Game, which shows the same image to two people and requires them to type in a word describing it, ostensibly to read each other's minds, but also to create a text-searchable database of images (a problem which computer algorithm designers have yet to crack);
- Tag a Tune, a similar game using songs instead of images; and
- Verbosity, in which one player is shown a secret word and must provide clues from which a second player attempts to guess the secret word, all with the effect of creating a database of word meanings.
July 31, 2008
Programming Requires Technical Skill; Scratch That -- Anyone Can Do It
MIT's Media Lab has developed Scratch, a programming language designed to be easy enough for people without any technical skills to "create and share video games and animated stories." It has become particularly popular among children aged 8-15. Scratch, which is available for free download, "uses a simple set of modular building blocks that can be dragged into place and snapped together on a computer screen like Lego bricks, to create simple computer programs and animations." Over 160,000 projects created using Scratch have been uploaded to the Scratch web site in the year since Scratch was made publicly available.
June 11, 2006
Yet Another Kind of Sourcing
Arnold Brown, in an article in the July-August 2006 issue of The Futurist, uses the term "othersourcing" to refer to "the increasing ability to have work done not only off-site and by other entities . . . but by nonhumans," particularly robots. He provides various examples of ways in which software, robots, and other machines are being used to perform tasks that were once thought to require creativity and therefore to be the exclusive province of humans.
June 6, 2006
Inventing by and for the Masses
We've all heard of "outsourcing," "insourcing," "offshoring," "competitive sourcing," and "near-shoring." Now Wired is reporting on "The Rise of Crowdsourcing" -- the use of average people and their networked computers to create content, solve problems, and invent. The article is well worth a read.
Most relevant to this blog is the example of InnoCentive which, according to its web site, "is an exciting web-based community matching top scientists to relevant R&D challenges facing leading companies from around the globe. We provide a powerful online forum enabling major companies to reward scientific innovation through financial incentives." In other words, if a company has a technical problem that its own engineers can't crack, the company can post the problem on InnoCentive's web site and award a prize to anyone -- from anywhere in the world -- who can solve it. It's bounty hunting for inventions, at $10,000-$100,000 a pop.
The Wired article goes into some detail on a particular inventor, Ed Melcarek, who has solved several problems posted on InnoCentive from the comfort of his one-bedroom apartment in Barrie, Ontario. Why hire a team of high-priced engineers to solve a problem without any guarantee of success when you can farm out the work to garage inventors around the world and only pay cash on delivery?
In the context of open-source software, Eric Raymond said, "given enough eyeballs, all bugs are shallow." Perhaps now we can say the same thing about inventions.
September 22, 2005
Replacing clinical trials with computers
New Scientist reports on work performed by Richard Ho at Johnson & Johnson to test experimental diabetes drugs inexpensively and quickly by using computers to simulate the effects of such drugs on virtual patients. According to the article, IBM Business Consulting Services considers biosimulation to be a key driver of innovation in the pharmaceutical industry over the next decade.
Although this work is not directly related to automated inventing, simulators more generally are playing a critical role in such inventing. Their ability to eliminate or reduce the need for physical construction and testing in various domains is likely to continue radically reducing the resources needed to develop new products and processes.
August 11, 2005
Never send a computer to do a human's job, especially if the human works for free
Although this blog is about computer automation, humans still outshine computers in the ability to make aesthetic judgments. Despite advances in automated image processing, for example, computers still have a long way to go in recognizing the contents of a photograph or judging whether a new clothing design would be visually appealing to customers.
Interactive evolutionary computation attempts to provide the best of both worlds by combining the ability of computers to generate new designs with the ability of humans to evaluate their aesthetics. Techdirt writes about a researcher at Carnegie Mellon University who has created two online games (here and here) that are fun to play in their own right, and which use the input of the games' users to improve the ability of computers to search for and recognize digital images.
The human players of these games provide descriptive labels of images they are shown and point out key portions of such images, two tasks that computers perform poorly. The human input, however, can then be used by computers to better search for and recognize subsequent images. For example, if many human players of the first game have labeled images of elephants with the word "elephant," when someone then performs a search for "elephant" images, computer software can easily pull up the right pictures just by searching through the human-provided labels, rather than by attempting to recognize the images themselves.
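The label-driven search described above amounts to building an inverted index from human-provided labels. A minimal sketch (the image names and labels are hypothetical, not data from the actual games):

```python
from collections import defaultdict

# Hypothetical labels collected from game players: image -> words.
player_labels = {
    "photo1.jpg": ["elephant", "grass"],
    "photo2.jpg": ["car", "street"],
    "photo3.jpg": ["elephant", "water"],
}

# Invert into a search index: word -> set of images carrying that label.
index = defaultdict(set)
for image, words in player_labels.items():
    for word in words:
        index[word].add(image)

def search(word):
    """Find images by human-provided label -- no image recognition needed."""
    return sorted(index.get(word, set()))

print(search("elephant"))  # ['photo1.jpg', 'photo3.jpg']
```

The hard perceptual work (deciding that a photo shows an elephant) is done once by humans; afterwards the computer only performs a cheap dictionary lookup.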
Although I don't believe that either of these games uses interactive evolutionary computation, the philosophy behind both is the same: to form a partnership between computers and humans, using each for what it does best. And when the humans provide input for free, deciding whether to incorporate their superior aesthetic judgments into computer software is a no-brainer.
August 10, 2005
IPcentral ponders the difficult question of who should own the technical know-how that is inside the heads of workers at high-tech companies. The posting was motivated by a recent court ruling that temporarily bars a former Microsoft employee from performing search-related work for his new employer, Google, because doing so would violate his non-compete agreement with Microsoft.
Trade secret law and non-compete agreements have long been used to control the movement of know-how and other information stored in the heads of human scientists, engineers, and programmers. But what happens when we "bottle" such know-how, or its equivalent, in the form of software that can design machines and write software? You might think that a company that develops an improved genetic algorithm that assists it in designing new machines should maintain that algorithm as a closely-guarded trade secret. After all, isn't the algorithm the functional equivalent of an engineer's know-how within the framework of the company's business model?
But I don't think the answer is entirely obvious. Perhaps the company should seek a patent on the algorithm, thereby obtaining a period of time in which it can block competitors from using the same algorithm even if they develop it independently. Or maybe it should use some combination of intellectual property protection and licensing mechanisms to secure the maximum value for the company.
The point is that transferring know-how from a human mind to software raises some tricky legal and business considerations that will need to be addressed as the automation of invention continues.
July 22, 2005
Overcoming the software bottleneck
Hardware has always improved more quickly than our ability to write software for it. Although you might think that this "software crisis," as evidenced by increasingly buggy, bloated, and insecure software, is a recent phenomenon, the term was in use at least as early as 1972. I remember reading about it in Fred Brooks' classic The Mythical Man-Month, first published in 1975.
Is this a job for genetic programming or some other automated programming technique? I don't know the answer, but when a technological "crisis" has lasted for 30+ years with no signs of abating, anything is worth a try.
More on teleportation
Although such "teleportation" (I put it in quotes because it seems more like "remote cloning" to me) would not actually automate "invention," it would help to automate manufacturing. Combine automated inventing with long-distance automated manufacturing and you start to get much closer to being able to automatically transform ideas into physical reality.
July 19, 2005
Shift seen from human coding to machine learning in the face of uncertainty
ComputerWeekly.com reports on the use of machine learning, instead of human-written algorithms, by Microsoft Research Cambridge (MRC) to solve problems in domains such as handwriting recognition. The reason, according to Christopher Bishop, MRC's Assistant Director, is that:
It is impossible to write an algorithm for recognising handwriting; many have tried but all have failed. There are too many differences even in the writing of one person to be able to write a set of rules or algorithm.
. . .
With handwriting recognition, for example, we do not try to program a solution; we give the computer lots of examples of handwriting, along with the text, and the machine learns to match the electronic image to the text.
The preceding quotes are taken from Bishop's British Computer Society Lovelace Lecture. A report of and slides from the lecture may be found here.
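Bishop's point can be illustrated with a toy example: rather than hand-writing rules, store labeled examples and classify new input by similarity to them. The eight-element "feature vectors" below are invented stand-ins for real handwriting images, and the nearest-neighbour rule is just one simple learning method, not what MRC actually uses.

```python
# Toy example-based classifier: no hand-written recognition rules,
# only labeled examples and a similarity measure.
import math

# Hypothetical training data: (feature vector, label) pairs standing
# in for digitized handwriting samples.
training = [
    ([0.9, 0.1, 0.8, 0.2, 0.9, 0.1, 0.8, 0.2], "a"),
    ([0.1, 0.9, 0.2, 0.8, 0.1, 0.9, 0.2, 0.8], "b"),
]

def classify(sample):
    # 1-nearest-neighbour: the label of the closest training example wins.
    def dist(x, y):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))
    return min(training, key=lambda ex: dist(ex[0], sample))[1]

print(classify([0.85, 0.15, 0.75, 0.25, 0.9, 0.1, 0.8, 0.2]))  # "a"
```

Adding more examples improves the classifier without anyone writing a new rule, which is exactly the shift from programming to learning that Bishop describes.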
Much of the legal scholarship on intellectual property protection for software has focused on protection for "algorithms." In fact, many scholarly treatments have used the terms "software" and "algorithms" interchangeably. Developments such as those described by Bishop make clear that the subject matter of computer science is not limited to algorithms. The legal profession needs to wake up to this difference between the law's conception of computer science and the way that computer science works in the real world.
June 27, 2005
First File Sharing, Then People Sharing
"Teleport" is in quotes because the technology, even if it existed, wouldn't really allow you to teleport yourself, but instead would create the illusion of doing so. To "teleport" yourself to your friend's house, you would need a camera connected to the Internet, and your friend would need a special machine stocked with synthetic atoms. The camera would take a picture of you and transmit the picture to your friend's machine, which would assemble the synthetic atoms into your shape. Capturing and transmitting images over time would produce a moving replica of you at your friend's house.
It seems that the same technology could be used to perform remote manufacturing. The manufacturer of a machine could transmit a 3D design for a machine anywhere in the world, and have the machine manufactured on location, thereby saving the cost of transportation. Unlike in the case of you and your friend, there wouldn't even need to be a physical "original" from which to make copies, just a CAD file generated using software. And any number of copies could be made, assuming a sufficient supply of raw materials.
This would bring the basic feature that has been causing so many problems for the "copyright industry" -- namely worldwide, instant, perfect, and (essentially) costless copying of audio and video -- to the manufacture of physical machines. Although automated manufacturing based on CAD-generated designs is old news in certain fields, it is unclear what its implications would be if it were to become ubiquitous.
Biological Viruses Spread by Computer Viruses?
Drew Endy of the MIT Biological Engineering Division gave a great keynote address at GECCO 2005 today describing the work he and his team are doing. Instead of attempting to summarize the talk here, I encourage you to check out his web site to find out more about what he is doing.
In the tail end of his talk, Professor Endy discussed some of the risks of biological engineering, including the risk of lone "garage bio-hackers" engineering harmful biological substances and unleashing them on the world. As he acknowledged, this risk is not new but may be exacerbated by improvements in biological engineering.
One limitation on the risk posed by biological materials is that the harm they can cause is limited in geographic scope. Airborne anthrax can only travel so far. A computer virus, in contrast, can travel worldwide almost instantly. If (as Professor Endy described earlier in his talk) it becomes increasingly possible to engineer biological systems using abstractions that are decoupled from the underlying biological substrate, one can imagine designing a biological ("real") virus by writing a computer program. Couple this with desktop biological manufacturing and virtual "teleportation" (see my next post) and we could see the spread of biological viruses using computer viruses. The previous sentence is complete science fiction, but still interesting to ponder.
June 21, 2005
European Parliament Renames the "Software Patent Directive"
ZDNet UK has an article reporting that the European Parliament has proposed changing the name of the so-called "software patent directive" to use the term "computer-aided invention" in place of the old term, "computer-implemented invention." For those of you who can't see what difference this could possibly make, the stated reason for the proposed change is to make it clear that software is not patentable per se, but rather only as part of an innovation that uses software "to aid the performance of the invention."
These and other attempts over the years to find just the right term to define patents on software are, in my view, doomed to fail. One motivation for my "software patent puzzle" (see parts 1 and 2) is to demonstrate that there is no principled way to use the hardware/software distinction as a basis for distinguishing patentable inventions from unpatentable ones.
Instead, I propose that we focus our attention on the patentability of "computer-generated inventions." Examples of computer-generated inventions include software (a computer generates software when you program the computer) and any device whose design is generated by a genetic algorithm or other artificial creativity software.
It is the "computer-generated" feature of software that keeps causing problems for patent law, and that will continue to cause problems for patent law as computers automate the invention of things other than software. I can't justify that claim in a single blog posting, but I tried to make the basic argument here, and will continue to extend the argument in future postings.
June 17, 2005
Inventors Work Hard to be Lazy
I don't think necessity is the mother of invention - invention, in my opinion, arises directly from idleness, possibly also from laziness. To save oneself trouble.
The purpose of many inventions is to make life easier for the inventions' users. New toasters, cars, and lawn mowers make it easier for the people who use them to make toast, travel, and mow the lawn.
But the purpose of many other inventions is to make it easier for inventors to invent. An engineer might write a computer program to simulate new designs for automobile frames, thereby saving the time, money, and effort needed to build and test physical prototypes of the frames. Similarly, engineers have long invented tools, ranging from calipers to electronic calculators, that make it easier to build and test their inventions accurately.
Inventors invent such new devices to make their lives as inventors easier -- "[t]o save oneself trouble," in Christie's words. The use of such "invention-facilitating inventions" is usually transparent to the end user, who has no way to know whether his new toaster was designed by pure human ingenuity or by an automated computer program.
Invention-facilitating inventions have long been patentable. But such inventions arguably "promote the progress of useful arts" (the ultimate purpose of U.S. patent law) only indirectly, by reducing the resources (e.g., time, money, raw materials) required to invent. If this is a sufficient basis for patentability, then why not allow improvements in pure mathematics, or at least improvements in "pure software" that performs calculations more efficiently? Surely such improvements may be applied to facilitate the process of inventing.
Although I won't attempt to provide any answers to these questions here, I think that the debate over software patents is in part a debate over how direct the connection needs to be between the function performed by a computer program and some real-world ("practical" or "industrial") use for patent protection to be justified. Future improvements in automated inventing will only make resolution of this question more pressing.
June 13, 2005
What's in a name? Computer "science" vs. "engineering"
Illigal Blogging has a nice post criticizing the initial choice to use the word "science" (as in "computer science") to describe what computer people do. (I'll use the term "computer people" in this post to avoid any bias toward "science" or "engineering.") I agree with the author's assertion that the use of "science" instead of "engineering" is unfortunate because it fails to capture the significant ways in which computer scientists/engineers use computers to develop new solutions to real-world problems.
There has been similar debate about the use of the term "machinery" in the name of the computer profession's preeminent association: The Association for Computing Machinery. R.W. Hamming, in his acceptance speech for the 1968 Turing Award (ACM Digital Library subscription required), said:
At the heart of computer science lies a technological device, the computing machine. Without the machine almost all of what we do would become idle speculation, hardly different from that of the notorious Scholastics of the Middle Ages. The founders of the ACM clearly recognized that most of what we did, or were going to do, rested on this technological device, and they deliberately included the word “machinery” in the title [of the ACM]. There are those who would like to eliminate the word, in a sense to symbolically free the field from reality, but so far these efforts have failed. I do not regret the initial choice. I still believe that it is important for us to recognize that the computer, the information processing machine, is the foundation of our field.
The focus on the "scientific" and theoretical aspects of what computer people do has affected how the law has viewed computer software. For example, an argument that continues to be raised against the patenting of software to this day (and particularly strongly in Europe at the moment) is that software is "abstract" or "intangible" and therefore lacking in the "practical" or "technical" nature required for patent protection. I think that this dispute about whether software is "abstract" (and hence not patentable) or "technical" (and hence susceptible of patent protection) has some of the same roots as the debate about whether computer people are doing science, mathematics, engineering, some combination of them, or something completely different.
Although much of what computer people do is reasonably classified as "science," many are using computers to design new solutions to practical problems -- in other words, to engage in engineering. The debate over legal protection for software needs to incorporate a more nuanced understanding of what computer people do if there is to be any hope of a rational resolution to that debate.