Thursday, October 1, 2015

How Does the Economy Affect Patent Litigation?

When I was in practice, the conventional wisdom was that litigation (of all kinds) grew during recessions, because people were less optimistic and less willing to let slights go, and instead fought over every dollar. Alan Marco (Chief Economist, PTO), Shawn Miller (Stanford Law Fellow), and Ted Sichelman (San Diego) have attempted to tackle this question with respect to patent litigation. They examine litigation rates from 1971-2009 in conjunction with a variety of macroeconomic factors.

Their paper is coming out in the Journal of Empirical Legal Studies, but a draft is on SSRN. The abstract follows:

Recent studies estimate that the economic impact of U.S. patent litigation may be as large as $80 billion per year and that the overall rate of U.S. patent litigation has been growing rapidly over the past twenty years. And yet, the relationship of the macroeconomy to patent litigation rates has never been studied in any rigorous fashion. This lacuna is notable given that there are two opposing theories among lawyers regarding the effect of economic downturns on patent litigation. One camp argues for a substitution theory, holding that patent litigation should increase in a downturn because potential plaintiffs have a greater incentive to exploit patent assets relative to other investments. The other camp posits a capital constraint theory that holds that the decrease in cash flow and available capital disincentivizes litigation. Analyzing quarterly patent infringement suit filing data from 1971-2009 using a time-series vector autoregression (VAR) model, we show that economic downturns have significantly affected patent litigation rates. (To aid other researchers in testing and extending our analyses, we have made our entire dataset available online.) Importantly, we find that these effects have changed over time. In particular, patent litigation has become more dependent on credit availability in a downturn. We hypothesize that such changes resulted from an increase in use of contingent-fee attorneys by patent plaintiffs and the rise of non-practicing entities (NPEs), which unlike most operating companies, generally fund their lawsuits directly from outside capital sources. Over roughly the last twenty years, we find that macroeconomic conditions have affected patent litigation in contrasting ways. Decreases in GDP (particularly economy-wide investment) are correlated with significant increases in patent litigation and countercyclical economic trends. On the other hand, increases in T-bill and real interest rates as well as increases in economy-wide financial risk are generally correlated with significant decreases in patent suits, leading to procyclical trends. Thus, the specific nature of a downturn predicts whether patent litigation rates will tend to rise or fall.
The authors also have a guest post at Patently-O discussing their findings.

I don't have too much to add to their analysis; the notion that a credit crunch will reduce litigation makes a lot of sense.
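
For readers who want to see what the estimation setup looks like, here is a minimal sketch of a quarterly VAR of the kind the abstract describes. This is my own illustration, not the authors' specification or data; the file name and series names (filings, gdp_growth, tbill_rate) are hypothetical placeholders.

```python
# Minimal quarterly VAR sketch (illustrative only; not the authors' model or data).
import pandas as pd
from statsmodels.tsa.api import VAR

df = pd.read_csv("quarterly_series.csv", index_col="quarter")  # hypothetical file
df = df[["filings", "gdp_growth", "tbill_rate"]].dropna()

model = VAR(df)
results = model.fit(maxlags=4, ic="aic")  # let AIC choose the lag length
print(results.summary())

# Impulse responses: how a shock to GDP growth propagates to filings over 8 quarters
irf = results.irf(8)
irf.plot(impulse="gdp_growth", response="filings")
```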

My two primary additional comments are as follows:

1. There is a lot more to the findings and the authors' analysis than presented in the Patently-O post. For example, the authors document a shift in litigation from competitor-based to licensor-based claims. The full paper is worth a read.

2. I am not fully convinced about what this tells us about the period from 2010-2014. The authors hint that economic growth during that time correlates with a drop in litigation, but the drop in litigation came only in late 2014 (and reversed itself in early 2015, as they note). This is further complicated by the change in how we count litigation after the America Invents Act effectively required that unrelated defendants be sued in separate cases. I think a lot more work (and creative thought) needs to be done to meld the pre- and post-AIA data into a coherent data set.

[UPDATE: I've been corrected - patent litigation by defendant count apparently decreased more than I let on (see, e.g., here) if you exclude false marking claims. This tempers some of my skepticism, though I would still like to see the post-AIA data combined with the pre-AIA data.]

Wednesday, September 30, 2015

Trade and Tradeoffs: The Case of International Patent Exhaustion

When I read all the briefs for Lexmark v. Impression Products—the en banc Federal Circuit case on patent exhaustion that will be argued Friday—it seemed like there were pieces missing, including related to an article Daniel Hemel and I are working on. So we've written and posted a short Essay about the case, Trade and Tradeoffs: The Case of International Patent Exhaustion. If ten pages is too long, we also have an even shorter guest post up at Patently-O today, Will the Federal Circuit Recognize the U.S.–Foreign Tradeoff in Friday’s Lexmark Argument? Comments welcome!

Sunday, September 27, 2015

Supreme Court To Consider 12 Patent Petitions Monday

So far, there are zero patent cases (or other IP cases) on the Supreme Court's docket this Term. But tomorrow is the first conference since the Court's summer break, also known as the "Long Conference," at which the Justices will consider twelve petitions in Federal Circuit patent cases. Only one of the twelve involves patentable subject matter, and I don't think the chances of the Court taking it are high. What other issues have been teed up?

The only one of the twelve to make SCOTUSblog's Petitions We're Watching page is W.L. Gore v. Bard, but I'm not sure why they're watching. The long-running dispute over the Gore-Tex patent has now turned to an effort to overturn the longstanding rule that patent licenses may be either express or implied, but the arguments don't seem particularly compelling.

Perhaps somewhat more worth watching is Life Technologies v. Promega, which involves extraterritorial application of U.S. patent laws in a case where LifeTech was found to have actively induced its own foreign subsidiary. The case has strong advocates (Carter Phillips for Life Technologies and Seth Waxman for Promega), and the petition is supported by amici Agilent Technologies and Professor Tim Holbrook, and by a dissent below from Chief Judge Prost.

There are two petitions related to whether the recent Supreme Court § 285 decisions (Octane Fitness and Highmark) also changed the standard for willful infringement under § 284: Halo v. Pulse and Stryker v. Zimmer. As Jason Rantanen noted at Patently-O, Judge Taranto's concurrence in the denial of rehearing en banc in Halo explained why Halo was not the right vehicle, but noted that some § 284 issues could warrant en banc review in a future case. I think the Supreme Court might give the Federal Circuit time to work this out.

I/P Engine v. AOL questions whether the Federal Circuit's de facto standard of review in obviousness cases (including implementation of KSR's "common sense" approach) is insufficiently deferential to factual findings. The Federal Circuit's obviousness holding knocked out a $30 million jury verdict (over a dissent by Judge Chen), and the petition is supported by the Boston Patent Law Association and i4i. But this doesn't look like a winner to me: obviousness is a mixed question of fact and law; the Federal Circuit has always articulated what seems like the right standard of review; and it's hard to say the Federal Circuit has vigorously embraced KSR (see, e.g., the end of this post).

None of these seem like must-takes, but we'll see! Grant decisions will likely be released later in the week.

Thursday, September 24, 2015

The Difficulty of Measuring the Impact of Patent Law on Innovation

I'm teaching an international and comparative patent law seminar this fall, and I had my students read pages 80–84 of my Patent Experimentalism article to give them a sense of the difficulty of evaluating any country's change in patent policy. For example, although there is often a correlation between increased patent protection and increased R&D spending, it could be that the R&D causes the patent changes (such as through lobbying by R&D-intensive industries), rather than vice versa. There is also the problem that patent law has transjurisdictional effects: increasing patent protection in one country will have little effect if firms were already innovating for the global market, meaning that studies of a patent law change will tend to understate the policy's impact.

It is thus interesting that some studies have found significant effects from increasing a country's patent protection. One example I quote is Shih-tse Lo's Strengthening Intellectual Property Rights: Experience from the 1986 Taiwanese Patent Reforms (non-paywalled draft here). In 1986, Taiwan extended the scope of patent protection and improved patent enforcement. Lo argues that these changes were plausibly exogenous (i.e., externally driven) because they resulted from pressure by the United States rather than from domestic lobbying, and he concludes that the strengthening of patent protection caused an increase in R&D intensity in Taiwan.

One of my students, Tai-Jan Huang, made a terrific observation about Lo's paper, which he has given me permission to share: "My first intuition when I see the finding of the article is that the increase of R&D expenses may have something to do with the tax credits for R&D expenses rather than stronger patent protection." He noted that in 1984, Taiwan introduced an R&D tax credit through Article 34-1 of the Investment Incentives Act, which he translated from here:
If the reported R&D expenses by manufacturing industry exceeds the annual highest spending on R&D in the last five years, 20% of the exceeding expenses could be used for tax credit for income tax. The total tax credit used could not exceed the 50% of annual income tax, but the unused tax credit could defer to next five years.
Additional revisions were made in 1987, related to a tax credit for corporations that invest in technology companies, which might indirectly lead to an increase in R&D spending by tech companies. As I've argued (along with Daniel Hemel) in Beyond the Patents–Prizes Debate, R&D tax credits are a very important innovation incentive, and Lo doesn't seem to have accounted for these changes in the tax code. Yet another addition to the depressingly long list of reasons it is hard to measure the impact of patent laws on innovation!
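
To make the mechanics of the quoted Article 34-1 provision concrete, here is a stylized calculation with entirely hypothetical numbers; it is my own gloss on the translation above, not tax advice.

```python
# Stylized sketch of the translated Article 34-1 mechanics (hypothetical numbers).
def article_34_1_credit(rd_expense, prior_five_year_high, income_tax):
    excess = max(rd_expense - prior_five_year_high, 0)
    credit = 0.20 * excess                   # 20% of R&D above the prior five-year high
    usable = min(credit, 0.50 * income_tax)  # capped at half of this year's income tax
    carryforward = credit - usable           # unused credit carries forward (up to five years)
    return usable, carryforward

# Example: NT$120m of R&D against a prior high of NT$100m and a NT$5m tax bill
# yields a NT$4m credit, of which only NT$2.5m is usable this year.
print(article_34_1_credit(120e6, 100e6, 5e6))
```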

Friday, September 18, 2015

The Availability Heuristic and IP

I'm reading (or more accurately, listening to) Thinking, Fast and Slow, by Daniel Kahneman. The book is an outstanding survey of the psychological literature on how we form judgments and take mental shortcuts. The number of studies of highly trained statisticians who make basic statistical errors in everyday tasks is remarkable.

The book is a must-read, I think, for scholars of all types. Not only does it provide plenty of food for thought on how to think about forming judgments from research, but its informal style also allows Kahneman to take a meta-view, describing problems of reproducibility and intractable debates in his own field (which, not surprisingly, ring true in IP research as well).

I'll have a couple of posts on this topic in the coming weeks, but the first relates to the availability heuristic. This mental shortcut usually manifests itself by giving greater weight, importance, or perceived frequency to events that are more "available" to the memory - that are more easily conjured by the mind. You usually see this trotted out in debates about the relative safety of air versus car travel (people remember big plane crashes, but far more people die in car accidents). I've also seen it raised in gun control debates, as more children die in swimming pools than from accidental gunshots (especially if you consider the denominators: the number of pools versus the number of guns). But pools are a silent killer. (Note that I make no statement on regulation - perhaps pools are underregulated; insurance companies seem to act as if they are.)

Thursday, September 10, 2015

More Evidence on Patent Citations and Measuring Value

For years, researchers have used patent citations as a way to measure various aspects of the innovative ecosystem. They have been linked to value, information diffusion, and technological importance, among other things. Most studies find that more "forward citations" - that is, more later patents citing back to a given patent - mean more of all these things: more value, more diffusion, and more importance.

But forward citations are not without their warts. For example, both my longitudinal study of highly litigious NPEs and random patent litigants and Allison, Lemley & Schwartz's cross-sectional study of all patent cases filed in 2008-2009 found that forward citations had no statistically significant impact on patent validity determinations. Additionally, Abrams et al. found that actual licensing revenue followed an inverted "U" shape with respect to forward citations (Lisa writes about that paper here). That is, revenue grew as citations grew, but after a peak, revenues began to fall as forward citations grew even larger. This implies that the types of things we can measure with forward citations may be limited by just how many there are, and also by the particular thing we are trying to measure.
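
To fix ideas, an inverted U is the sort of pattern a simple quadratic specification would capture; what follows is my own stylized gloss, not Abrams et al.'s actual model.

```latex
% Stylized quadratic regression of licensing revenue on forward citations
\[
  \text{revenue}_i \;=\; \beta_0 + \beta_1\,\text{cites}_i + \beta_2\,\text{cites}_i^{2} + \varepsilon_i,
  \qquad \beta_1 > 0,\; \beta_2 < 0,
\]
% so expected revenue rises with citations up to a peak at
% cites^* = -\beta_1 / (2\beta_2) and declines thereafter.
```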

This is why it was so great to see a new NBER paper in my SSRN feed yesterday (it's not totally new - for those who can't get NBER papers, a draft was available about a year ago). The paper, by Petra Moser (NYU), Joerg Ohmstedt (Booz & Co.) & Paul W. Rhode (UNC) is called Patent Citations and the Size of the Inventive Step - Evidence from Hybrid Corn. The abstract follows: 
Patents are the main source of data on innovation, but there are persistent concerns that patents may be a noisy and biased measure. An important challenge arises from unobservable variation in the size of the inventive step that is covered by a patent. The count of later patents that cite a patent as relevant prior art – so called forward citations – have become the standard measure to control for such variation. Citations may, however, also be a noisy and biased measure for the size of the inventive step. To address this issue, this paper examines field trial data for patented improvements in hybrid corn. Field trials report objective measures for improvements in hybrid corn, which we use to quantify the size of the inventive step. These data show a robust correlation between citations and improvements in yields, as the bottom line measure for improvements in hybrid corn. This correlation is robust to alternative measures for improvements in hybrid corn, and a broad range of other tests. We also investigate the process, by which patents generate citations. This analysis reveals that hybrids that serve as an input for genetically-related follow-on inventions are more likely to receive self-citations (by the same firm), which suggests that self-citations are a good predictor for follow-on invention.
I love this study because it ties something not only measurable, but objective, to the forward citations. This is something that can't really be done with litigation and licensing studies, both of which have a variety of selection effects that limit their random (shall we say, objective) nature. More on this after the jump.

Tuesday, September 8, 2015

Laura Pedraza-Fariña on the Sociology of the Federal Circuit

The Federal Circuit has faced no shortage of criticism in its role as the expert patent court, including frequent Supreme Court reversals and calls for abolition of its exclusive patent jurisdiction (most prominently from Seventh Circuit Chief Judge Diane Wood, though she was far from the first). In Understanding the Federal Circuit: An Expert Community Approach, Laura Pedraza-Fariña (Northwestern Law) argues that the sociology literature on "expert communities" helps explain the Federal Circuit's "puzzling behaviors."

She suggests that "[t]he drive that expert communities exhibit for maximal control and autonomy of their knowledge base . . . explains why the Federal Circuit is less likely to defer to solutions proposed by other expert communities, such as the PTO," as well as "to defy non-expert superior generalists, such as the Supreme Court." Expert communities also engage in codification of their domains to demonstrate their expertise, manage internal dissent, and constrain subordinate communities, and Pedraza-Fariña argues that this tendency explains the Federal Circuit's frequent preference for rules over standards. (As she notes, this is related to Peter Lee's argument that the Federal Circuit adopts formalistic rules to limit the extent to which generalist judges must grapple with complex technologies.) Finally, expert communities seek to frame borderline problems as within their area of control, and to place inadequate weight on competing considerations outside their expertise—qualities that critics might also pin on the Federal Circuit.

Friday, September 4, 2015

Nothing is Patentable

I signed onto two amicus briefs last week, both related to the tightening noose of patentable subject matter. Those familiar with my article Everything is Patentable will know that I generally favor looser subject matter restrictions in favor of stronger patentability restrictions. That ship sailed, however; apparently we can't get our "stronger patentability restrictions" ducks in a row, and so we use subject matter as a coarse filter. It may surprise some to hear that I can generally live with that as a policy matter; for the most part, rejected patents have been terrible patents.

But, now that these weaker patents are falling like dominoes, I wonder whether subject matter rhetoric can stop itself. This has always been my concern more than any other: the notion of unpatentable subjects is fine, but actually defining a rule (or even a standard) that can be applied consistently is impossible.

This leads us to the amicus briefs. The first is in Sequenom, where the inventors discovered that a) fetal DNA might be found in maternal blood, and b) the way to find it is to amplify the paternally inherited fetal DNA in the blood. The problem is that the discovery is "natural" and people already knew how to amplify DNA. As Dennis Crouch notes, this seems like a straightforward application of Mayo - a non-inventive application of the natural phenomenon. Kevin Noonan and Adam Mossoff were counsel of record on the brief.

But here's the thing: it's all in the way you abstract it. Every solution is non-inventive once you know the natural processes behind it. This argument is at the heart of a short essay I am publishing in the Florida L. Rev. Forum called Nothing is Patentable. In that essay, I show that many of our greatest inventions are actually rather simple applications of a natural phenomenon or abstract idea. As such, they would be unpatentable today, even though many of them survived subject matter challenges in their own day.

Returning to Sequenom, there were other ways to parse the natural phenomenon. For example, it is natural that there is fetal DNA in the mother's blood, but finding it by seeking out only the paternally inherited DNA is a non-conventional application of that phenomenon. No one else was doing that. Or, it is natural that there is fetal DNA in the mother, but finding it within the blood is a non-conventional application of that phenomenon. After all, no one had done it, or even thought to do it, before. Either of these two views is different from the type of application in Mayo v. Prometheus, which simply involved giving a drug and then measuring the level of the drug in the system (something you would expect to find after giving the drug). In Mayo, the court commented on the bitter divide over what to do about diagnostics, and punted for another day. That day has come.

The second amicus brief is in Intellectual Ventures v. Symantec; Jay Kesan filed this brief. In the Symantec case, the district court ruled that unique hashes to identify files were like license plates, and therefore conventional. Further, it noted that the unique ids could be created by pencil and paper, given enough time. It distinguished virus signatures (an example in PTO guidance of something that is patentable) by saying that file ids were not really computer based, while virus signatures were. I mention this case in my Nothing is Patentable essay as well.

I have less to say about this ruling, but I think it is wrong on both counts. First, unique file id hashes are much more like virus signatures than they are like license plates. There is a rich computer science literature in this area - solving problems by identifying files through codes derived from their content. Of course, computer science folks will say this is not patentable because it's just math. That's a different debate; but it is surely not the same thing as attaching a license plate to a car. Second, this notion that people can do it with a pencil and paper has got to go. As the brief points out, with enough people and enough time, you can simulate a microprocessor. But that can't be how we judge whether a microprocessor can be patented, can it?
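
For readers outside computer science, here is a minimal sketch of content-based file identification; it is my own illustration of the general technique, not the claimed invention.

```python
# Content-derived file identification (illustrative sketch): hashing a file's bytes
# yields an id that depends only on the content, so identical files share an id
# and changing even one byte produces a different id.
import hashlib

def file_id(path, algo="sha256"):
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Example: two byte-identical copies of a file produce the same id, matched
# without any byte-by-byte comparison.
```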

These two cases show the pendulum swinging - and hard - toward a very restrictive view of patentability. Taken seriously and aggressively applied, they stand for the proposition that many of the fruits of current R&D are outside the patent system -- even though their historical analogues were patentable. Perhaps I'm being a pessimist; I sure hope so.

Friday, August 28, 2015

Dow v. NOVA: Maybe Nautilus Does Matter

In June 2014, the Supreme Court held in Nautilus v. Biosig that the Federal Circuit's "insolubly ambiguous" test for indefiniteness was "more amorphous than the statutory definiteness requirement allows," and that the proper test is whether the claims "fail to inform, with reasonable certainty, those skilled in the art about the scope of the invention." But is this actually a stricter test?

Jason Rantanen (Iowa Law) posted a nice essay this spring, Teva, Nautilus, and Change Without Change (forthcoming Stan. Tech. L. Rev.), arguing that in practice, the answer has been no: "The Federal Circuit continues to routinely reject indefiniteness challenges . . . . Indeed, with one exception, the Federal Circuit has not held a single claim indefinite under the Nautilus standard, and even that one exception would almost certainly have been indefinite [pre-Nautilus]." (Since then, the court also held the Teva v. Sandoz claims indefinite, but it had done the same pre-Nautilus.) Rantanen also noted that the Federal Circuit has failed to grapple with the meaning of Nautilus and has continued to rely on its pre-Nautilus cases when evaluating definiteness. In one case, the court even reversed a decision that claims were indefinite for reconsideration after Nautilus—implying that the Nautilus standard might be less stringent! (I've noticed the Federal Circuit similarly undermine the Supreme Court's change to the law of obviousness in KSR.)

But the Federal Circuit's decision today in Dow Chemical Co. v. NOVA Chemicals Corp. carefully examines the change Nautilus has wrought. Dow's asserted claims cover an improved plastic with "a slope of strain hardening coefficient greater than or equal to 1.3," and NOVA argued that the patents fail to teach a person of ordinary skill how to measure the "slope of strain hardening." In a prior appeal (after a jury trial), the Federal Circuit had held the claims not indefinite under pre-Nautilus precedent. The district court then held a bench trial on supplemental damages, leading to the present appeal. In today's opinion by Judge Dyk, the Federal Circuit holds that Nautilus's change in law "provides an exception to the doctrine of law of the case or issue preclusion," and concludes that the claims are indefinite under the new standard.

The Federal Circuit dismisses the hand-wringing over whether Nautilus really meant anything, stating that "there can be no serious question that Nautilus changed the law of indefiniteness." The court notes that "Nautilus emphasizes 'the definiteness requirement's public-notice function,'" and that "the patent and prosecution history must disclose a single known approach or establish that, where multiple known approaches exist, a person having ordinary skill in the art would know which approach to select. . . . Thus, contrary to our earlier approach, under Nautilus, '[t]he claims . . . must provide objective boundaries for those of skill in the art.'"

Examining the claims at issue, the court notes that the patents state that "FIG. 1 shows the various stages of the stress/strain curve used to calculate the slope of strain hardening," but the patents contain no figure showing the stress/strain curve. There were four ways to measure the slope, which could produce different results, but the patents provided no "guidance as to which method should be used or even whether the possible universe of methods is limited to these four methods." The claims thus fail the new test: "Before Nautilus, a claim was not indefinite if someone skilled in the art could arrive at a method and practice that method," but "this is no longer sufficient."
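
To see why the choice of measurement method matters, here is a toy numerical sketch; the curve is purely synthetic and the three slope conventions are generic illustrations of my own, not the four methods at issue in the case.

```python
# Toy illustration: different slope conventions give different numbers from one curve.
# The data below are synthetic, not measurements from the Dow patents or trial record.
import numpy as np

strain = np.linspace(0.1, 1.0, 50)
stress = 10 * strain ** 0.6            # a gently concave "strain hardening" region

slope_fit = np.polyfit(strain, stress, 1)[0]                        # least-squares fit
slope_secant = (stress[-1] - stress[0]) / (strain[-1] - strain[0])  # endpoint secant
slope_tangent = np.gradient(stress, strain).max()                   # steepest local slope

print(slope_fit, slope_secant, slope_tangent)  # three different numbers, same curve
```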

Tuesday, August 25, 2015

Evaluating Patent Markets

I've been interested in patent markets for some time. In addition to several articles studying NPE litigation, I've written two articles discussing secondary markets explicitly: Patent Portfolios as Securities and Licensing Acquired Patents.

Thus, I was very interested in Michael Burstein's (Cardozo) draft article on the subject, called Patent Markets: A Framework for Evaluation, which is now on SSRN and forthcoming in the Arizona State L.J.

What I like about the approach of this article is that it takes a step back from the question of whether certain types of parties create a market, and asks instead, is having a market at all a good thing?

Here is the abstract:
Patents have become financial assets, in both practice and theory. A nascent market for patents routinely produces headline-grabbing transactions in patent portfolios, and patent assertion entities frequently defend themselves as sources of liquidity essential for a patent market to function. Much of the discourse surrounding these developments assumes that a robust, liquid market for patents would improve the operation of the patent system. In this Essay, I challenge that assumption and systematically assess the cases for and against patent markets. I do so by taking seriously both the underlying innovation promotion goal of the patent system and the lessons of financial economics, and asking what might be the effects of a market for patents that looked roughly like other familiar markets for stocks, real estate, or secondhand goods.

I conclude that, like much in patent law, the effects of robust patent markets are likely to vary with specific technological and business contexts. When there is a close fit between patents and useful technologies, a patent market can support a market for technology that aids in connecting inventors with developers and sources of capital for commercialization. But when that fit breaks down, market pricing could favor litigation over commercialization. Similarly, a liquid patent market might help to allocate the risks of innovation and of patent infringement to the parties best able to bear it, but a kind of moral hazard familiar to the market for subprime mortgages could lead not to more innovation but to more patents, thereby increasing the overall risk in the system. This analysis suggests that we are having the wrong conversation about patent markets. Rather than assuming their utility and asking how to improve them, we should be undertaking empirical research to determine the circumstances in which they will or will not work and exercising caution in invoking the logic of markets in policy debates about the contours of the patent system.
Like other markets, they are good when they are good, and bad when they are bad. Burstein adds a lot of nuance throughout the article, focusing on arguments why markets may be good or not, but without making too many assumptions about any particular technology or patent owner type.

One thing I would add to the article is the importance of timing. Early markets might be better than later markets, even in the same technological context. The article would probably put this into the "business context" category, but I think the importance of diffusion, cumulative innovation, and path dependency merits separate consideration.

In all events, I think the essay adds to the literature and may produce some testable hypotheses as well.

Saturday, August 1, 2015

Some Reflections on Localism, Innovation, and Jim Bessen's New Book "Learning by Doing"

I highly recommend Jim Bessen's new book, Learning by Doing: The Real Connection Between Innovation, Wages and Wealth (2015), published by Yale University Press. I was lucky to present alongside Jim at Yale's first Beyond IP Conference, where he discussed his ideas about the importance of education and worker training for a successful innovation economy. As Bessen puts it in his book,
innovation can suffer from two distinct problems: markets can fail to provide strong incentives to invest in R&D, and they can fail to provide strong incentives for learning new skills. Underinvestment in R&D is not the only problem affecting innovation. It might not even be the most important problem. ... There is simply no justification for focusing innovation policy exclusively on remedying underinvestment in R&D, especially since most firms report that patents, which are supposed to correct this underinvestment, are relatively unimportant for obtaining profits on their innovations.
The takeaway is that protecting inventions with patents and copyrights can't be the sole function of an effective innovation policy. Governments need to focus on a much broader range of policies to "encourage broad-based learning of new technical skills, including vocational education, government procurement, employment law, trade secrecy, and patents."

At IP Scholars in Chicago this year, I'll be presenting my new paper Patent Nationally, Innovate Locally.  Like Bessen, I will talk about a broad range of innovation incentives that focus on research and technology commercialization, as well as public investments in STEM education, worker training, and public infrastructure. I'll argue, however, that when intellectual property rights are not the chosen mechanism, many of these incentives should come from sub-national governments like states and cities because they are the smallest jurisdictions that internalize the immediate economic impacts of public investments in innovation.*  While states cannot internalize the benefits of patent and copyright regimes that result in widespread disclosure of easily transferable information, they can internalize the benefits of innovation finance (direct expenditures of taxpayer revenues on innovation) especially when those expenditures go towards improving the education, skills, and knowledge-base of the local labor force.

Innovation finance (IF) is an important new frontier in IP law scholarship. Not only does innovation finance supplement federal IP rights by correcting market failures in technology commercialization and alleviating some of the inefficiencies created by patents and copyrights, but it also takes into account Bessen's point: "markets can fail to provide strong incentives to invest in R&D, and they can fail to provide strong incentives for learning new skills." Both market failures are important, and the latter may be even more important than the former. But if we really want to focus on a broader range of policies like government procurement and support for public education to "encourage broad-based learning of new technical skills," as Bessen suggests, then we need to start looking at state and local governments.

To understand this point, take the example of a government prize for developing a better way to manufacture cars without using as many resources (e.g. 3D printing). If the federal government gives the prize, this makes some sense: assuming the prize hits its mark, national taxpayers will eventually benefit when the innovation is perfected and widely adopted, and the information on how to do it becomes public. But the impacts of the prize are going to be very different for different parts of the country. First off, the prize winner has to locate its research and operations somewhere. Presumably, it's going to choose a state like Michigan or Ohio with the resources, facilities, and human knowledge-base to do this kind of research and experimentation. The immediate benefits for local firms and residents are obvious: jobs, tax revenues, business for local companies. There is also a less perceptible but far more important benefit: easier access to new technical knowledge coming out of the experiments and inside information on emerging market developments. Plentiful research suggests that a lot of knowledge is hard to transfer and that effective exchange requires proximity, especially when science-based research and unfamiliar technology are involved. The implication for local officials seeking to boost the regional economy is clear: the more innovation that happens in your jurisdiction and the more residents who gain skills in an important new field, the better off your state or city will be. (This is the basis for innovation cluster theory and the idea that regions gain competitive advantages from localized knowledge exchange, originally discussed by UC Berkeley's AnnaLee Saxenian.)

Given that the immediate economic impacts of the 3D printing prize, including the tax revenues and most of the spillovers, are geographically localized to certain regions, do we really want federal policymakers designing these types of incentives, and do we really want taxpayers in states like Alaska and Arizona footing the bill? Or do we want significant input – both political and financial – from the places in which the innovation is occurring? I think the answer is the latter. The benefits of decentralizing fiscal policy are numerous. I see at least two major benefits in this case: fairer shouldering of tax burdens, and more efficient innovation policies as a result of the better information and stronger incentives of local officials. Not only are they aware of the capabilities and needs of the local economy, but they can also act swiftly in response to local problems, liberated from the wrangling of "earmark politics" at the national level. The same principles apply to education and incentives for learning new skills – the second prong of Bessen's revitalized innovation policy. For example, would we expect national policymakers, who act in the national interest and are beholden to federal taxpayers, to supply the right amount of vocational training for future workers in the newly invented 3D printing automobile industry of my hypothetical? No: we would expect the main push for this kind of training to come from a state like Michigan with the right mix of interested workers and industry players.

In short, I suggest that innovation policy in the United States is not federal. It is bifurcated: the federal government protects exclusive rights in new inventions and original expression using patents and copyrights; states, cities and sub-national governments use innovation finance to capture the geographically localized economic benefits of innovation.

There are several responses to my argument. If innovation finance were all local, then wouldn't there be a major under-supply of research, especially for innovations without a clear market, like research into rare debilitating diseases or (until Elon Musk) space exploration? Wouldn't states compete with each other and end up spending way too much to attract firms into their jurisdictions? Aren't local politicians vulnerable to capture by local industries? I agree that all these risks exist. This is why I discuss a variety of instances where the federal government has an important role to play. Besides protecting copyrights and patents in new inventions, the federal government does a lot of direct financing for innovation too. This money goes towards education, basic research, and mission R&D (mainly in national defense) – all of which produce pervasive national spillovers as well as localized ones. On the flip side, the federal government also has a variety of means for controlling and coordinating the actions of sub-national governments in order to reduce corruption, wasted expenditures and "beggar thy neighbor" competition. Some of these preemptive forces come from discretionary judicial doctrines like the Dormant Commerce Clause (admittedly a weak source of limits on states); others are or perhaps should be statutory (the Patent Act??).

If you have comments or seek a draft of Patent Nationally, Innovate Locally, or my other working paper, Cluster Competition, which argues that the federal government is trying to "manage" state competition to grow innovation clusters through the America Competes Act's regional innovation program, please email me.

* The basic principle of fiscal decentralization is "the presumption that the provision of public services should be located at the lowest level of government encompassing, in a spatial sense, the relevant benefits and costs." 

Thursday, July 30, 2015

Kiesling & Silberg on Incentives for Rooftop Solar

I've written about innovation policy experimentation and about incentives beyond IP, so I was interested in a new working paper posted by Lynne Kiesling and Mark Silberg, Regulation, Innovation, and Experimentation: The Case of Residential Rooftop Solar. They are not lawyers, but their description of incentives for the development and commercialization of rooftop solar will be of interest to legal scholars of innovation, as it underscores that the role of the state is far more complex than simply providing IP incentives. (Indeed, the paper never mentions IP.)

These incentives include a 30% federal tax credit (set to expire at the end of 2016), as well as many state-level incentives, such as volumetrically reduced subsidies to benefit first movers, net metering policies requiring credits to consumers who produce excess energy, and financial regulations that allow third-party financing to help consumers avoid upfront capital expenses. As they note, "the details matter," and "[n]ot all renewable portfolio standards are equal." This paper seems to nicely encapsulate many of those details.
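
To see why details like the credit and net metering matter, here is a back-of-the-envelope payback sketch with entirely hypothetical numbers; it is my own illustration, not drawn from the paper.

```python
# Back-of-the-envelope rooftop solar payback (hypothetical numbers, illustrative only).
system_cost = 20_000                       # installed cost of a residential system, $
federal_credit = 0.30 * system_cost        # 30% investment tax credit
net_cost = system_cost - federal_credit    # $14,000 after the credit

annual_kwh = 8_000                         # electricity generated per year
retail_rate = 0.13                         # $/kWh credited under full net metering
annual_savings = annual_kwh * retail_rate  # $1,040 per year

print(net_cost / annual_savings)           # ~13.5-year payback with the credit
print(system_cost / annual_savings)        # ~19.2 years without it
```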

Monday, July 27, 2015

Rachel Sachs & Becky Eisenberg on Incentives for Diagnostic Tests

I highly recommend two recently posted articles on declining innovation incentives for diagnostic tests, particularly due to changes in patentable subject matter doctrine. In Innovation Law and Policy: Preserving the Future of Personalized Medicine, Rachel Sachs (Petrie-Flom Fellow at Harvard Law) examines the intersection of IP with FDA regulation and health law, joining a growing body of scholarship that seeks to situate IP within its broader economic context. Here is the abstract:
Personalized medicine is the future of health care, and as such incentives for innovation in personalized technologies have rightly received attention from judges, policymakers, and legal scholars. Yet their attention too often focuses on only one area of law, to the exclusion of other areas that may have an equal or greater effect on real-world conditions. And because patent law, FDA regulation, and health law work together to affect incentives for innovation, they must be considered jointly. This Article will examine these systems together in the area of diagnostic tests, an aspect of personalized medicine which has seen recent developments in all three systems. Over the last five years, the FDA, Congress, Federal Circuit, and Supreme Court have dealt three separate blows to incentives for innovation in diagnostic tests: they have made it more expensive to develop diagnostics, made it more difficult to obtain and enforce patents on them, and reduced the amount innovators can expect to recoup in the market. Each of these changes may have had a marginal effect on its own, but when considered together, the system has likely gone too far in disincentivizing desperately needed innovation in diagnostic technologies. Fortunately, just as each legal system has contributed to the problem, each system can also be used to solve it. This Article suggests specific legal interventions that can be used to restore an appropriate balance in incentives to innovate in diagnostic technologies.
Diagnostics Need Not Apply is a new essay by Rebecca Eisenberg (UMich Law) that was nicely summed up by Nicholson Price: "let's just admit it - diagnostic tests are unpatentable."
Diagnostic testing helps caregivers and patients understand a patient’s condition, predict future outcomes, select appropriate treatments, and determine whether treatment is working. Improvements in diagnostic testing are essential to bring about the long-heralded promise of personalized medicine. Yet it seems increasingly clear that most important advances in this type of medical technology lie outside the boundaries of patent-eligible subject matter.
The clarity of this conclusion has been obscured by ambiguity in the recent decisions of the Supreme Court concerning patent eligibility. Since its 2010 decision in Bilski v. Kappos, the Court has followed a discipline of limiting judicial exclusions from the statutory categories of patentable subject matter to a finite list repeatedly articulated in the Court’s own prior decisions for “laws of nature, physical phenomena, and abstract ideas,” while declining to embrace other judicial exclusions that were never expressed in Supreme Court opinions. The result has been a series of decisions that, while upending a quarter century of lower court decisions and administrative practice, purport to be a straightforward application of ordinary principles of stare decisis. As the implications of these decisions are worked out, the Court’s robust understanding of the exclusions for laws of nature and abstract ideas seems to leave little room for patent protection for diagnostics.
This essay reviews recent decisions on patent-eligibility from the Supreme Court and the Federal Circuit to demonstrate the obstacles to patenting diagnostic methods under emerging law. Although the courts have used different analytical approaches in recent cases, the bottom line is consistent: diagnostic applications are not patent eligible. I then consider what the absence of patents might mean for the future of innovation in diagnostic testing.
As I have written, I think changes to patentable subject matter doctrine are an important problem for medical innovation, and that policymakers should think seriously about whether additional non-patent innovation incentives are needed in this area.

Thursday, July 23, 2015

The Latest on Biosimilars: The Federal Circuit Holds that the "Patent Dance" Is Optional

In a previous post, I discussed a district court decision holding that the process for resolving patent disputes under the Biologics Price Competition and Innovation Act (BPCIA) is optional. That post contains extensive background on the BPCIA and its purpose of providing an abbreviated pathway for “biosimilar” drugs to get to market and compete with their branded analogs, resulting in lower prices for consumers. The bottom line is that, under the BPCIA, makers of biosimilar products can rely on the clinical trial data developed for the branded (or “reference”) product in order to accelerate FDA approval. Nevertheless, the BPCIA provides 12 years of data exclusivity to the manufacturer of the reference product. And beyond that period, even if the biosimilar garners FDA approval, the brand owner can try to continue to keep it out of the market by asserting claims of patent infringement. The BPCIA provides for a procedure involving pre-suit information exchange between the brand and biosimilar makers—the so-called “patent dance”—that is intended to apprise the brand of the biosimilar’s manufacturing process and narrow down the number of patents to be asserted. But the district court, and now the Federal Circuit on appeal, have held that the biosimilar can lawfully refuse to participate in the patent dance.

Wednesday, July 22, 2015

Several Empirical Studies on Injunctions Post-eBay

Chris Seaman recently released a draft of his new paper, Permanent Injunctions in Patent Litigation After eBay: An Empirical Study. In the paper, he presents the results of his empirical study of contested permanent injunction decisions in district courts for a 7½-year period following eBay (May 2006 to December 2013). This post follows up my previous posts on Seaman's WIPIP presentation and on Ryan Holte's paper assessing the effects of eBay. Kirti Gupta and Jay Kesan also just released their own study on eBay's impact.