Wednesday, November 30, 2011

Innovation Waits for No One!




By Keith McDowell

“Now is not the time!” Have you ever been given that line? Even better, did you know that the classical concept of “now” that we all live by was forever destroyed by Einstein in 1905? That’s right! Two events that are simultaneous – that is, occur at the same time – in your personal reference frame occur at different times for someone speeding by you in their car. Wow! Talk about getting a person to church on time, as famously crooned and celebrated in My Fair Lady. Obviously, the incremental difference in “now” is so tiny that we don’t notice it – unless, of course, you’ve purchased the latest near “light speed” automobile as a Christmas present.

But then “time waits for no one.” Or does it? Can we slow down time by speeding away toward a distant galaxy? Well yes – relative to the reference frame of Earth, but it won’t change your personal perspective on the “passage of time.” And then there are those folks who race through life, hoping to slow down time and catch a few more moments. Good luck on that approach!
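
For readers who want the physics behind that claim, the standard special-relativity result – a textbook formula, nothing original here – is time dilation: a clock moving at speed v relative to you accumulates less time than your own clock does, by the factor

    \Delta t_{moving} = \Delta t_{yours} \, \sqrt{1 - v^2/c^2}.

At highway speeds v/c is roughly 10^{-7}, so the correction is on the order of one part in 10^{14} – far too small to notice, which is why our everyday, shared sense of “now” serves us perfectly well.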

What about innovation? Does it wait for us to “find the time” or does it wait for no one? I think the latter. And therein lies a problem for America. We’ve created an innovation ecosystem with moving parts or processes that waste time checking for conformity to accepted norms or established patterns of behavior. It’s an authoritarian-gatekeeper system guaranteed for the most part to replicate the norm and produce “me too” research and incremental innovation. We like to pretend that it is an “open system” where discoveries and innovations constantly bubble up to the surface of our conscious “now” – a system where the best and the brightest quickly reach the frontiers of the creative mind through independent research and innovation. But the reality is often different. And at the heart of the problem on the discovery end of the creation pipeline is one of society’s and academe’s oldest control mechanisms: peer review.

What is “peer review” and how does it affect the innovation ecosystem? “Peer review” is a simple concept. The notion is that one’s performance, whether as an individual or a collection of individuals, should be evaluated by one’s peers. It’s a practice carried out routinely in our legal system using a jury of “peers.” In academe, the practice takes many forms including principally the following activities:

  • Refereed publications
  • Grantsmanship
  • Tenure and promotion
  • Post-tenure review
  • Program review

To the consternation of many of our best researchers, these activities have grown over the past decade or two to the point that they have pushed aside the time needed to think creatively and be innovative. As Daniel J. Meyer stated succinctly in an article from The Chronicle of Higher Education: “It’s getting impossible to produce my own work I’m spending so much time assessing others!” He further states that “I have many comrades (not ‘in arms’ yet, but it is coming) who are experiencing an unbearable overload of review duties. … Draconian measures, you say? Perhaps. But maybe this is a Drago we should embrace. If not, we are going (to) [sic] take an ailing peer-review system and kill it outright.”

These are strong sentiments and they are shared by many including the author. But can we document the reality in the hope of finding a remedy? It’s tough to do. For example, let’s examine post-tenure review.

Post-tenure review came into vogue in the late 1990s as an accountability or audit tool to satisfy politicians and legislators that someone was looking over the shoulder of tenured faculty members to make sure they continued to be productive following tenure. Typically, on a timescale of five to eight years, tenured faculty members prepare a massive dossier documenting their performance, including student teaching evaluations. Often, external letters are solicited. Depending on the review, corrective actions might be taken, including a change in teaching load, a reduction in research space, or a host of other such actions. It’s the tenure process redux. And in most institutions, the data gathering is now formalized through the maintenance of a yearly faculty activity report. Woe unto the faculty member who doesn’t log in and update his or her data profile in a timely manner!

The demand for such performance data and accountability has become a battle cry for some elements of the right-wing conservative movement in America. The O’Donnell brouhaha in Texas comes to mind in that regard. But let’s be clear. While I support post-tenure review and the use of the faculty annual report, they represent a new element in the innovation ecosystem and they consume time – lots of it.

And then there are program reviews. Once again, the accountability and audit mentality dictated that university programs should be reviewed on a regular basis with a cycle time of five to eight years. Massive reports are created and external reviewers are conscripted – usually with the bribe of a stipend – to pass judgment on a program or department. Based on such data analysis, the Texas Higher Education Coordinating Board has determined that a number of “underperforming” physics programs should be shut down in Texas. Hmm, that should be a real motivator for poor and disadvantaged STEM students in those affected areas! Have we just turned off the next Michael Dell? Has their concept of “now” turned into “yesterday”?

Aside from the recent appearance of post-tenure review and program review, we’ve had in place since before World War II the process of reviewing research and scholarly manuscripts as a means to generate “refereed” publications. I’ve spoken to that issue previously. But what are the hard numbers? In Figure 1, I display the growth in the number of science and engineering publications using recently published data from the National Science Foundation. Over the twenty-year period from 1988 to 2008, the number of such publications nearly doubled and likely has now passed the one million mark per year. That’s a lot of papers to review for the science and engineering community!

Figure 1
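
As a back-of-the-envelope check – my arithmetic, not NSF’s – a doubling over a twenty-year span corresponds to a compound annual growth rate of

    g = 2^{1/20} - 1 \approx 0.035,

or roughly 3.5 percent per year. That sounds modest in any single year, but it compounds relentlessly into the reviewing load carried by the community.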

With respect to grantsmanship and the peer review of proposals, the data appear to show some measure of saturation over the past decade. Using data taken from the annual Merit Review Reports to the National Science Board, I display in Figure 2 the number of externally reviewed proposals along with the number of distinct reviewers per year. Interestingly, the two numbers are approximately the same – one proposal per reviewer! One might argue that the past decade has shown a crossover in number of proposals versus number of distinct reviewers, but it will take another decade to prove this assertion, if true.

Figure 2

The number of distinct reviews for the same time period is shown in Figure 3. Again, not much growth has occurred, and the numbers fluctuate from year to year.

Figure 3

A detailed examination of the NSB reports seems to indicate a small trend toward fewer reviews per proposal. Based on these hard data, one cannot conclude that the peer review of proposals has significantly increased as a burden over the past decade. Instead, the process appears to have saturated. But it still consumes time and is based on proposed research, not performance. I’ve addressed that issue and its effect on innovation elsewhere.
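
A simple identity – not new data, just a way to connect Figures 2 and 3 – ties these numbers together: the average number of reviews a proposal receives is the ratio of distinct reviews to proposals, and because the number of proposals roughly equals the number of distinct reviewers, that same ratio is also the average number of reviews each reviewer writes per year:

    \text{reviews per proposal} = \frac{\text{distinct reviews}}{\text{proposals}} \approx \frac{\text{distinct reviews}}{\text{reviewers}} = \text{reviews per reviewer}.

In other words, even a “saturated” system still asks each participating reviewer for several reviews every year, on top of manuscripts, tenure cases, and program reviews.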

While peer review is firmly ingrained in the American innovation ecosystem, it’s time to understand how we use it and whether it truly is the wisest course of action as we enter the era of global competition.  Now is the time for America to come to terms with peer review, lest our competitors move faster and push our “now” into their “yesterday.” Innovation waits for no one.

Tuesday, November 15, 2011

EXCUSE ME! Your Microscope Is Out of Focus


By Keith McDowell

So how many hours a day do you spend on social networking? Do you tweet, text, or use email services? How many computers, tablets, cellphones, and other wireless devices do you own? Have you recently updated your Facebook and LinkedIn profiles or added new “friends” and connections? Let’s face it folks. Social networking in all its various forms is an exploding new phenomenon, rapidly penetrating all levels of society and creating new channels of rapid communication. Does anyone doubt that the movement “Occupy Wall Street” or the occurrence of “flash mobs” would exist without social networking?

But is “social networking” something that should be studied, researched, and understood through funding by the National Science Foundation (NSF)? Or should such research be assigned a “low priority,” having little or no benefit for society and America? How about understanding terrorist crowdsourcing and other cyber threats to national security played out using social networking? Are they not important subjects to be understood?

And then there is the ever-present fruit fly – a real irritant to social conservatives and those who see waste in the federal funding of research. Does anyone really care whether the design of fruit fly genitalia affects their ability to “hook up” and copulate? Of course, it’s not a topic that keeps me awake at night, absent a fruit fly infestation in my home. But I respect the judgment of experts in the field that such research is important. As Forrest Gump said about a box of chocolates: “You never know what you’re gonna get.”

Let’s be clear. Picking winners or losers in advance in the game of discovery and innovation is mostly a waste of time. It’s not an issue of defunding “whimsical” research, whatever that is. And who determines the winners in advance? What are the criteria? Would you have picked social networking as a multi-billion dollar industry before the fact?

Unfortunately, some people in America take legitimate concerns – about what research should be funded, what metrics should be used, whether the processes currently in place are appropriate and sufficient, whether waste and fraud are rampant, and what America’s strategic endgame is – and attempt to exploit the scientific illiteracy of many Americans, coupled with extreme and often counterfactual social conservatism, to achieve political gains at the expense of discovery and innovation in America. Such an exercise was recently conducted by Senator Tom A. Coburn in his report of April 2011 entitled The National Science Foundation: Under the Microscope.

Americans, including me, support a balanced budget. But is doing “more with less” to the point of starvation a realistic and appropriate goal for our country, especially in the era of global competition? Let’s examine the facts and the history that led to the Coburn Report.

Recognizing that America gives every indication of falling behind in global competition, including the loss of jobs, Congress passed the America COMPETES Act (Public Law 110-69) calling for a doubling of NSF funding over seven years. Passage of the Act was the culmination of many studies, including Rising Above the Gathering Storm, and a clarion call from leadership in nearly all segments of American society. But according to the Coburn Report, the “dramatic increase in spending passed with little debate or dissent.”

The report further challenges whether increasing the NSF budget “to bolster our economy” is a magic bullet. Instead, the report purports to document widespread fraud, waste and abuse of the taxpayer dollar through funding of wasteful and controversial projects of limited scientific benefit, excessive amounts of expired funds, inadequate contracting practices, lack of accountability metrics, excessive funding of conference and related travel, duplicative funding with other government agencies, inappropriate behaviors, and lack of transformative research, to name some of the report’s assertions. These are serious charges and they must be taken seriously and dealt with appropriately independent of one’s political persuasion or the underlying belief system and principles that support the characterization and interpretation of the facts in the Coburn Report.

While I applaud Senator Coburn for engaging the debate, I strongly and emphatically disagree with both the specifics and the intent of the report’s recommendations. Implementation of the recommendations as structured will play only at the margins and will assuredly dampen both discovery and innovation in America. But we have some common ground! From my perspective, Coburn and his staff put the NSF under a microscope that was out of focus. So let’s review the recommendations of the Coburn Report and bring those recommendations properly into focus.

Establish Clear Guidelines for What Constitutes “Transformative” and “Potentially Transformative” Science: Good luck with such guidelines! Picking winners before they prove to be transformative is a useless exercise. I repeat: would you have chosen social networking as a winner? I put little stock in those who say they know it when they see it before the actual outcome. Discovery and innovation are mostly serendipitous exercises where the accumulation of sweat equity through the funding of putative non-transformative research and even “whimsical” research is essential. Making this argument is not to say that we should not have targeted research. Grand challenge research must be an essential part of the portfolio of funding and our Nation’s discovery and innovation strategic plan. But we should not eliminate or throttle exploratory research and innovation based on political or personal bias. Furthermore, the implied threat to be transformative, creative, innovative, or ELSE never works. The user-friendly mantra of Go Forth and Innovate! is the appropriate strategy.

Set Clear Metrics to Measure Success and Standards to Ensure Accountability: The STAR METRICS program is a worthy federal attempt to achieve this desired outcome and is supported by the Coburn Report and by the university community. Accountability has always been an integral part of the federal funding process. But one must remember that discoveries and innovations are not part of a programmed assembly line easily amenable to accounting and audit in the traditional sense. The debate as to what constitutes appropriate metrics for both research and innovation is ongoing and lively. It is by no means a settled matter, as emphasis on the commercialization of university research and the need for growth in American jobs dominates the discussion.

Eliminate NSF’s Social, Behavioral, and Economics (SBE) Directorate: Simple response: emphatically NO! We live in a world dominated by convergence and network science where What Is Easy Has Been Done. Transformative discovery and innovation will occur at the boundaries and overlap of the physical, biological, and social dimensions of our universe. Enough said on this recommendation!

Consolidate the Directorate for Education & Human Resources: With at least 100 STEM education programs and maybe as many as 200 spread across numerous federal agencies, we have a problem that needs immediate attention. I defer the reader to Go Forth and Innovate! for a full and complete discussion of this recommendation from my perspective. Suffice it to say that I agree with the Coburn Report that we must come to terms with which federal agency should take the lead in funding the STEM education agenda for America.

Use it or Lose It: NSF Should Better Manage Resources It Can No Longer Spend or Does Not Need and Immediately Return $1.7 Billion of Unspent, Expired Funds It Currently Holds: The Coburn Report represents that “[a]pproximately 47 percent of the 151,000 final and annual project reports required in the past 5 years were submitted late or not at all.” Furthermore, “The agency’s record of failing to place an emphasis on closing out expired grants and returning unused funds to the United States Treasury raises question [sic] about the overall fiscal management of the agency.” The Coburn Report concludes that “grant oversight remains as an ongoing management challenge at NSF.” I agree! There is no excuse for failing to file a final report, and reprobates and their institutions should be punished in some manner. But adopting the rather simplistic characterization of this issue taken by the Coburn Report is not the answer. As one who has managed multi-million dollar grants, I can assure the public that fiscal management of grant dollars is a challenge complicated by personnel timelines, academic schedules, and a plethora of other factors, such as the on-time delivery of needed and purchased equipment and its effect on completing the research project. NSF must have the ability to be flexible in this regard and to carry over unspent funds. The notion that NSF has $1.7 billion in available funds is naïve at best.

Reduce Duplication: Develop a Strategic Plan to Streamline Federal Research and Development: I agree in principle. We need a national debate about how we fund R&D and innovation in order to form a better strategic plan. However, duplication in and of itself is chump change in the larger arena and should not be the dominant factor.

Provide the NSF Inspector General Additional Resources and Place a Greater Emphasis on the Office of Inspector General’s Findings: From my experience, NSF and academe place a great deal of emphasis on the OIG’s findings. Some would argue we place an obsessive emphasis on them. Indeed, over the past decade or two, a vast bureaucracy has grown up to deal with the growth of federal rules and regulations and their interpretation. It has become a hyper-technical world where subtle nuances of the meaning of words make a difference. Do we really want further government intrusion into the business of federal funding of research? At some point, a proper cost-benefit balance must be struck. I submit we’ve already reached and perhaps surpassed that point. Further growth in the “accountability culture” will only stifle discovery and innovation and not achieve the desired end result. In that sense, I support the “deregulation” platform propounded in the political arena. But if more “resources” are to be poured into the OIG, I have a simple request. Hire the best and the brightest at a competitive salary. Far too often, university professionals sit across the table from OIG auditors and inspectors who would do well auditing Walmart but know almost nothing about the complexities of the federal funding of research.

Senator Coburn, sharpen the focus of your microscope and take out the fuzziness caused by political and personal bias. Starving discovery and innovation in America because of perceived and even real issues at the National Science Foundation is not the answer. Nor is highlighting and listing research programs that don’t fit your worldview. It’s time to move past an obsession and annoyance with fruit fly research and join with those of all persuasions to forge and craft a new strategic plan for R&D and innovation in America.

Wednesday, November 9, 2011

Are We Too Pooped To Pop?


By Keith McDowell

Do you need one of the modern energy pick-me-up concoctions to make it through the day? Do rolling power outages from a record number of days with temperatures over 100 degrees interrupt your lifestyle? How about the drain on your cellphone battery from too much texting, tweeting, emailing, and gaming? Is the high price of gasoline driving you to consider purchasing a hybrid or an electric car? How about global warming? Are you one of the many people suffering adverse consequences from the effects of severe weather events? If so, join the crowd! It’s a world gone crazy as we find ourselves “too pooped to pop” and “too old to stroll,” as Chuck Berry famously crooned.

But wait! Are we really running out of readily available energy or does it just seem that way at times? Have we become so dependent on rapid access to energy through our high-technology gadgets and our American lifestyle that the slightest interruption portends an energy crisis? Are the lights really dimming in America, the incandescent bulb controversy notwithstanding?

Like many, I thought for many years that we had a looming energy crisis, both in the production of electricity as available electrons and the production of fuels in the form of gasoline, home heating oil, and natural gas. I was certain America was in trouble, not because of dysfunctional government, energy policy, or lack of political will, but from fundamental scientific issues and some basic facts.

To begin with, fossil fuels have a limited lifetime of perhaps a century – quibbling about the exact time frame is a stupid exercise, although fracking and other new discoveries and techniques help to extend their contribution as an energy source. But the book Out of Gas by David Goodstein and the existence of Hubbert’s Peak convinced me that the rate of fossil fuel consumption would eventually outpace the rate of production. And the issues of human-driven global warming and adverse climate change as by-products of the use of fossil fuels, along with the concomitant increase in carbon dioxide emissions, were real show-stoppers for me. America needed to find a way to slow the use of fossil fuels.
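
For readers unfamiliar with Hubbert’s Peak, the usual textbook sketch – not Goodstein’s exact formulation, just the standard one – models cumulative production Q(t) of a finite resource as a logistic curve approaching the total recoverable amount Q_\infty, so that the production rate rises, peaks, and then declines:

    Q(t) = \frac{Q_\infty}{1 + e^{-k(t - t_m)}}, \qquad P(t) = \frac{dQ}{dt} = \frac{k\, Q_\infty\, e^{-k(t - t_m)}}{\left(1 + e^{-k(t - t_m)}\right)^2}.

Production peaks at t = t_m, when half of Q_\infty has been extracted; past that point, no amount of demand keeps the rate growing.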

The short-term solution appeared to be nuclear power, although Three Mile Island, Chernobyl, and now Fukushima demonstrate that real and long-term issues exist for the industry – Mother Nature being the strongest antagonist through earthquakes, tsunamis, and the gradual leakage of toxic and radioactive wastes into the biosphere. But as a physicist, I accepted these risks and knew that they could be mitigated.

Hydropower, geothermal, and biomass alternatives didn’t compute in either the short- or the long-term for me for many reasons including scale and basic economic considerations. Although wind power continues to demonstrate its efficacy as an alternative source for electrons, it will never answer the full projected demand curve for electrons nor solve the need for fuels. It’s not the ultimate or even the short-term solution.

The long-term solution seemed simple to me, but scientifically and technologically challenging. We needed fusion power! We needed sexy, high-technology physics projects like the National Ignition Facility to unlock the secrets of Mother Nature and to turn on the Sun right here on Planet Earth. Advancing our understanding of fusion power would culminate in a limitless energy source to do almost anything we wanted to do. And it would push forward the frontiers of science to boot! The scientist in me was thrilled at the prospects of yet another triumph of humankind over nature. But then I had my epiphany! That moment when you realize how stupid you’ve been and that the solution has been in front of you all along. It’s called solar energy!

My personal journey to the realization that solar energy is the solution began in 1979 and 1980. After reading the 1979 American Physical Society study entitled Solar Photovoltaic Energy Conversion by H. Ehrenreich – I’m one of the select few who actually read the document from cover to cover – and listening to the discourse at the time about photovoltaics, I became convinced that environmental issues surrounding the large-scale mining of the exotic metals needed to produce photovoltaic devices, the exorbitant costs, and the “low technology” flavor of solar panels militated against solar energy as a solution.

But then I read the wonderful article by George Johnson in the National Geographic for September 2009 entitled Plugging into the Sun that summarizes the current status of the solar-power industry around the world. Two important factors finally dawned on me. First, the flux of solar photons onto Planet Earth is enormous. Depending on one’s favorite metaphor, the Earth receives “more energy in one hour than the world used in one year” – the year in question being 2002. [Wikipedia] Translation: We already have a fusion source of energy that provides a nearly unlimited supply of free, convertible photons. Second, the technology needed to convert photons into electrons or electricity already exists, and market forces are rapidly making the production and sale of solar panels for homes or buildings cost-competitive. Furthermore, we already know how to construct and utilize giant solar power plants. Translation: solar power is not a scientific or technological challenge.
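
That “one hour versus one year” claim is easy to sanity-check with rough numbers – my back-of-the-envelope estimate, not a figure from the article. The solar constant is about 1.36 kW per square meter, and the Earth presents a cross-sectional area of roughly \pi (6.4 \times 10^6\ \text{m})^2 \approx 1.3 \times 10^{14}\ \text{m}^2 to the Sun, so the planet intercepts

    P \approx 1.36\ \text{kW/m}^2 \times 1.3 \times 10^{14}\ \text{m}^2 \approx 1.7 \times 10^{17}\ \text{W},

and one hour of that is about 6 \times 10^{20} joules – the same order of magnitude as the world’s total primary energy consumption in 2002, which was a few times 10^{20} joules.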

But this raises the question: why is America not jumping with both feet onto the solar energy bandwagon? We want energy independence. Why not solar, especially given that it is a “solved” solution? Thomas L. Friedman in Hot, Flat, and Crowded has opined more broadly on the issue, advocating for market forces to drive innovation across the spectrum of possible energy sources, including solar. Even as I write this article, an email has arrived in my inbox with a Paul Krugman editorial entitled Here Comes the Sun in support of solar energy and pointing out the rapid acceleration of the sector off most people’s radar screens – the Solyndra story notwithstanding. So what are the pitfalls for solar energy, and what about other alternative energy sources both for the production of electricity and for fuels?

One unspoken issue is that we might converge too quickly onto a specific solar energy industry using inferior technology. It’s both the “low tech” issue and, more importantly, the sunk cost issue that plagues the nuclear power industry. Many believe that the prevalent reactor design was chosen too quickly and is sub-optimal. Is the same thing happening to the solar industry?

And then there are all the usual technological and economic issues: the location and viability of American transmission lines (we need a smart grid), energy storage for off hours, distributed generation using local solar panels versus large-scale solar collection plants, and the associated costs of building out the infrastructure. But these are old issues, and we have to deal with them independent of choosing solar energy. Indeed, we do that every day. They are not show-stoppers for solar energy.

How about environmental issues? No one should be fooled. There is no such thing as “clean” or “green” energy, alternative or not! Every source of energy carries a burden whether in the production of the materials – think mining – used for the infrastructure or the process itself – consider the environmental issues of solar plants in a pristine desert. Even algae, a potential source for fuels and carbon feedstock, must be “fed” by phosphate salts – or the equivalent – taken from the earth. In the end, it is a trade-off.

And what about the issue of solar energy only producing electrons, not fuels? In my opinion, this is the true insertion point for innovation. What would we do with a large excess of electrons? Would we “flare” them off as we often do with natural gas? Or would we design ancillary systems to absorb and use them in creative ways? With enough electrons, one can convert lots of different materials into fuels. If we no longer need coal to produce electricity, we can extend its lifetime as a feedstock for fuels using electrons as the energy source for the conversion. Even better, we can invent new technologies for converting biomass into feedstock for the chemical and plastics industry instead of the ultimately futile game of converting biomass into fuel. And we can slow down the use of fossil fuels, thereby improving our biosphere and reducing global warming. Yes, we need more innovation through research and development including even studies on fusion energy.
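
One concrete illustration of that point – my example, not one from this article – is electrolysis: surplus electricity can split water into hydrogen, a storable fuel and chemical feedstock, at a thermodynamic minimum of about 237 kJ of electrical energy per mole of hydrogen produced:

    2\,\text{H}_2\text{O} \rightarrow 2\,\text{H}_2 + \text{O}_2, \qquad \Delta G^\circ \approx 237\ \text{kJ per mole of H}_2.

Practical electrolyzers need more than the minimum, but the principle stands: with an abundance of cheap electrons, fuels and feedstocks become an engineering problem rather than a resource problem.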

But do we have an energy crisis founded on scientific issues or basic natural facts as I originally thought? Are we “too pooped to pop?” Emphatically, NO! What we have is a lack of societal and political will to use and turn the levers and knobs available to us to effect the transformation to a solar economy. Mercifully, as suggested by Krugman and others, there are positive signs that change is afoot. I’m not an expert on the subject, but federal loan guarantees seem to be working in California. Tax incentives for the installation of home solar panels are essential. We should pursue any and all avenues to stimulate the solar transformation. In short, what we need is a well-constructed “Solar Electrification Program” similar in spirit to the original Rural Electrification program used during the Great Depression. We need to support the Solar Energy Industries Association as a counterpoint to government. We need to understand that solar equals jobs, and jobs now! With a clear understanding of where we stand in the space of solar power and the bigger space of the energy crunch, we can innovate and replace the current growth in short-term solar installation and construction jobs with longer-term high-technology jobs.

Are we too pooped to pop? I think not. It’s time for America to “pop” and go solar.

Tuesday, November 1, 2011

Free Agency: It's a Bad Idea!


By Keith McDowell

President Obama’s Council on Jobs and Competitiveness released an interim report in October entitled Taking Action, Building Confidence. The recommendations and actions proposed in the report comport with conventional wisdom as to what the Nation needs to be doing to accelerate the growth of jobs in America, especially through innovation and the creation of small startup companies. And due attention is given to the role played by the commercialization of university research through entrepreneurship and technology transfer and the need to enhance such activities. All would be well with the report from my perspective in terms of the recommendations for universities except for one thing: they got it wrong!

I suppose “getting it wrong” can be rationalized given the makeup of the Council. There is not a single representative from academe other than the Broad Institute of MIT and Harvard, and no one knowledgeable in the practical aspects of the commercialization of university research serves on the Council.

So what is it that they got wrong? In Initiative 2: Nurture the High-Growth Enterprises That Create New Jobs, one of their recommendations on page 22 is to “allow university faculty to shop discoveries to any technology transfer office.” It’s also referred to on page 21 as an “open-source” approach. Originally proposed by the Ewing Marion Kauffman Foundation as the “Free-Agent” or sometimes the “Free-Choice” model, it has been strongly, emphatically, and universally rejected by the university community as a very bad idea with many flaws that will not improve the commercialization of university research but will have the opposite effect: slowing it down! At a seminal meeting in February of 2010 hosted by the Council on Governmental Relations in Washington, I presented the arguments against the “Free Choice” model, and my speech and PowerPoint presentation are available. The Association of University Technology Managers and others have taken equally strong positions against this model.

But what could possibly be wrong with “open sourcing,” “free agency,” or “free choice?” It sounds like motherhood, apple pie, and the American way. Absent other information, it is easy to be swayed by these rhetorical flourishes and by the claims of the Kauffman Foundation and their supporters at the Department of Commerce that they have “studies” demonstrating the need for such a model. Here’s the reality: there are no such credible studies! Show me the credible studies and I’ll happily review them. What we have is only collected hearsay and the whisperings of disgruntled innovators and entrepreneurs who had a bad experience. No account is taken of the vast majority of commercialization activities successfully conducted every day by competent people all across the spectrum of American universities. No account is taken of the input and rejection of this model by university experts with decades of experience. No credence is given to the rapid expansion and transformation underway at universities in the commercialization of their research.

But let’s be specific for a moment and list some of the principal arguments against the “free agent” model. Additional information and further clarification can be found in the COGR material.

  • The concept didn’t work before Bayh-Dole.
  • The concept hasn’t worked internationally.
  • The concept hasn’t worked well in joint university experiments and some have dropped combined operations.
  • The approach will significantly slow down commercialization due to a) the complexity of multiple inventors and technologies, b) the complexity of funding sources for most research, c) the complexity of having multiple managers across different universities, d) the balkanization of faculty IP, and e) the tangled legal obligations with concomitant legal and financial liabilities.
  • Faculty conflict of interest through financial interests in licenses and startups.
  • Emphasis on personal benefit to faculty members over societal benefit.
  • Lack of practical experience in commercialization by faculty and their lack of available time to pursue a long-distance relationship.
  • Problems with faculty conflict of commitment with respect to outside agencies.
  • The concept ignores the investment of a faculty member’s university in the inventions.
  • There is no mechanism to bear the cost of free-agent commercialization – inventors don’t have the money and the home institution isn’t going to pay someone else the full cost.
  • Home institutions will be reluctant to risk money on IP managed by another institution.
  • Potential emergence of third party licensing entities requiring profit or sustainability, thereby driving up the costs.
  • “Cherry picking” of IP.
  • Problems with one university committing incredibly tight resources to another – not going to happen!
  • State law: Texas requires fair value for IP.
  • No evidence the model would improve more “modest” operations.
  • Technology transfer offices (TTO) believe it to be an inappropriate allocation of their resources and an inappropriate relationship.
  • Harm to the faculty-TTO relationship.

And these arguments are just the start of the problems for the free-agent model! In short, pursuing this model will waste a lot of valuable time and resources trying to “fix” all the problems with it. And guess what! Even if it is made to work, it will have a minimal to negligible effect on the speed or quality of the commercialization of university research to the marketplace. There will not be an explosion of discoveries or inventions suddenly being brought forth. There will be no explosion of new jobs! Do we really want to do this to ourselves?

Even at its best, the commercialization of university research is a “contact sport” requiring the building of relationships among many stakeholders. Those relationships take time and don’t work well as a long-distance marriage. Disputes and disagreements within such relationships should not become fodder for promoting a concept that is doomed to failure from the outset. 

But most insidious to me in the “free-agent” model is its failure to understand, promote, and accelerate the single most important step that we need to take in America: the creation of communities of innovation. We should not be running to MIT, Stanford, or some other presumed bastion of the almighty to enhance commercialization – my apologies to my good friends at MIT and Stanford. We should be building regional innovation ecosystems similar to the ones springing up all around America, using best practices from our peers such as MIT or Stanford. That means doing the hard work of improving our local university technology transfer and commercialization infrastructure, as I’ve advocated throughout my articles and as other leaders, such as the Association of University Research Parks and AUTM, have advocated. There are many great extant ideas for improving commercialization, such as “proof-of-concept” funding and innovation centers. Free agency as recommended in the Council’s interim report is a bad idea and a waste of time and resources. I strongly urge the President’s Council on Jobs and Competitiveness to remove this recommendation and replace it with ones that will have the desired outcome: jobs and prosperity for Americans.