Thursday, June 30, 2011

Gotcha! Technology Transfer Complaints


By Keith McDowell

The game of Gotcha! presents a challenge to many politicians and public figures when it comes to simple facts about geography or American history – the former Governor of Alaska being the most notable. But the Gotcha! game is not the only one being regularly practiced in America today. Among many such games, we also have the “gate” game, the blame game, the whining complaint game, and my favorite, the THEM game of science fiction origin. No matter the game, it’s always someone else’s fault when reality doesn’t match false expectations or doesn’t match cleverly invented altered realities. Of course, facts don’t matter! Even worse, facts become contextualized and interpreted to form the altered reality.

Society thrives on this primordial need to invent THEM, the nether world of beings and entities, whether mythical or real. It’s an evolutionary behavior pattern built into the collective psyche of humankind to win the competitive advantage. It’s the will to survive!

But believe it or not, one of the favorite targets of such behavior in the modern innovation age is the university technology transfer office (TTO) or office of technology commercialization. It’s so easy to target university TTOs. They don’t have the time or resources to respond. They are a bureaucracy populated by amorphous and faceless people. And it makes for good blood-sport and good press in the modern age of undoing enlightenment.

Whether of evolutionary origin or not, complaining is, at a more mundane level, a fact of life, and it behooves those who are the target of complaints to understand what drives them and whether some form of action is required. Whether valid or not, complaints inform the perceptions that many people hold toward university TTOs, to the point that they are now affecting American policy and legislation. President Obama continues to focus with laser precision on the role played by universities in innovation, the recent manufacturing initiative being a notable example.

There are many university leaders who choose to ignore the complaints and adopt the strategy of waiting for the tempest to dissipate. Or at best, they repeat scripted mantras and expect that somebody, somewhere, will “fix the problem.” I personally believe both responses to be bad strategy and not reflective of leadership in an era of global competition and the need for innovation. But to be fair, most of our nation’s university leadership is so busy fighting the budget battle that technology transfer is always number eleven on their top ten list of things to be dealt with. Furthermore, it’s difficult to address an issue without even a proper accounting or list of the complaints.

So, at the risk of furthering the Gotcha! game, here is an unranked and unprioritized list of the top complaints that I’ve collected over the past decade. A more complete discussion can be found in Go Forth and Innovate! It’s a long list, so don’t give up!

  • Universities are too slow in processing deals.
  • No one is empowered at the university to make the final decision and approve a deal.
  • The negotiating skill of university dealmakers is poor.
  • University IP is overvalued and the royalty rates are too high.
  • Public universities should give away IP since they are funded by taxes.
  • We paid for the research, so why should we pay for the IP?
  • What do you mean we can’t pre-value the IP from sponsored research?
  • Marketing of available IP is not adequate.
  • Patenting of university inventions hurts society as faculty are restricted in their ability to publish.
  • Scouts should “walk the halls” for us to find all the undiscovered commercializable technologies.
  • Stop throwing early technology “over the transom.”
  • We want “One Stop Shopping.”
  • Bundling of university patents helps patent trolls.
  • Universities themselves are patent trolls.
  • University greed upfront chokes startups.
  • Universities don’t share in the risk of startups.
  • Universities should use a single-template deal structure.
  • The nightmare scenario of sponsored research going to a competitor happens.
  • Universities don’t think globally.
  • Work with us, or we offshore our research!
  • University research facilities should be made readily available to businesses.
  • Universities need to act more like a business.
  • Universities are too bureaucratic.
  • Licensing revenues are not commensurate with the level of research expenditures, so someone is mismanaging the university IP.
  • Bayh-Dole is bad for the commercialization of university research.
  • Publication of innovations by faculty before IP protection creates problems downstream.
  • Publications are wasteful.

Did you find your favorite complaint on the list? Unraveling the truth or falsity of these complaints and putting them into their proper context is a significant and worthy task – one that many have undertaken, including AUTM and the present author. Constructive criticism, even if cast as a complaint, is a healthy exercise and part of how society functions. But we must remember that the commercialization of university research is in a transformational phase with everyone rapidly adjusting and adapting. The reality is that technology transfer and the commercialization of university research happen every day at universities with little fanfare and almost no angst beyond that present in any deal-making experience. I salute those who toil at such tasks as we celebrate the Fourth of July. They are not THEM!

Thursday, June 23, 2011

Brother, Can You Spare Me a Dime?


By Keith McDowell

The bane of all individual independent researchers is the moment when one must put pen to paper to frame the next grant proposal from an ill-formed idea. Independent of the quality of the idea – whether innovative, transformational, or really dumb – the first step for the researcher is always the same: send money! We’re all familiar with this terse communication from our children away at college. Nobel Laureate Richard Smalley, one of our nation’s leading experts in nanotechnology before his untimely death, was also fond of pitching this message at the end of his energy talks. He, of course, provided his lab address as the location to “send money.”

But why is the message “send money” the next step when a new idea emerges? No one doubts the aphorism that innovation through research and development requires funding. And no one doubts that grantsmanship by faculty members is the key determinant for tenure and promotion, often independent of the quality of the research or its relevance for the American innovation ecosystem. But why is “new” money needed to pursue a “new” idea? Why not use “old” money already in the system? The answer is surprisingly simple. The American system for funding R&D in our universities has become so accountability laden and so driven by a defined format that innovation through transformational and frontier research has been choked almost out of existence. You can’t use “old” money on a “new” idea.

While an excellent case can be made that the present system of funding R&D served America well in the Twentieth Century, there is a growing list of problems just as important as “old versus new money.” The time to re-examine that system has arrived, especially with regard to STEM university researchers. The argument that it is the best because it’s the best and we shouldn’t change it is a tautology and isn’t sufficient for the Twenty-First Century. But what form should a new system of funding take? Who should be funded and by what process? Should it be use-directed research founded on societal challenges or sandbox science?

Taking pen to paper, I have a proposal to offer to those unafraid of change. The core principle is that individual university researchers must be enabled, through base-level but minimal funding, to explore essentially random or self-determined pathways or research directions – whether basic research, applied research, or development – in an independent manner without pre-approval. Such activity is at the heart of transformational research and disruptive innovations. In some small measure we currently satisfy this principle by providing new faculty hires with startup funding. But that funding rapidly gets spent and is not replaced by additional sandbox funding. I would greatly expand upon this beginning, with a total revamping of individual investigator funding in the United States, and base the structure on people, not on specific research ideas.

To accomplish that end, my system would function in the following manner – beginning with a newly minted doctoral graduate. A university announces an opening for a tenure-track position at the rank of assistant professor and the hiring process follows the normal course of business including a description of the intended research focus. The hiring process serves not only to vet candidates for the faculty position, but to vet the candidates for funding. The new hire receives research funding from a startup package as well as a federal individual investigator grant or IIG. In other words, if they made it to the point of being hired, you fund them!

The IIG would likely be funded by a block grant to the university obtained through a competitive process, or some other process that achieved an appropriate distribution of IIG funds across all universities. Geographic distribution is both politically wise and essential for the growth of regional innovation ecosystems. The new hire would be guaranteed IIG funding for a specified period – likely five to seven years – at an annual rate of, say, $150,000, including summer salary support. The funds could be restricted from being used to hire graduate students, who would instead be funded by a separate federal or local fellowship program – an additional mechanism to spur independent thought and the training of independent investigators. Graduate students would also be funded by larger-scale grants such as center grants or lablet projects.

At the conclusion of the specified time period, the faculty member would undergo evaluation for tenure and/or promotion using the normal process with the added feature that renewal of the IIG award for another specified period of time would also be considered. Thus, at both the initiation of the IIG and its renewal, a peer review process is used. For the renewal of the IIG award, it would be a peer post-review of the accomplishments from the first award. It is a performance-based system.

The IIG funding with an appropriate cycle time and peer post-review process would continue throughout the career of the faculty member. Change of university, retirement, and other such issues are easily worked out. It would also make sense for superstars to get additional IIG funding, but only as part of the peer post-review process. Keep in mind that IIG functions as part of a larger funding system including center grants.
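To get a rough sense of the campus-level dollars implied by such a system, here is a back-of-envelope sketch. Only the $150,000 annual rate comes from the proposal above; the headcount of research-active, IIG-eligible faculty is purely an illustrative assumption on my part.

```python
# Back-of-envelope estimate of the steady-state IIG commitment at one campus.
# The $150,000 annual rate comes from the proposal above; the faculty headcount
# is an illustrative assumption, not part of the proposal.

ANNUAL_IIG_RATE = 150_000        # dollars per investigator per year
RESEARCH_ACTIVE_FACULTY = 1_000  # assumed IIG-eligible faculty at a large research university

annual_commitment = ANNUAL_IIG_RATE * RESEARCH_ACTIVE_FACULTY
print(f"Steady-state annual IIG commitment: ${annual_commitment:,.0f}")  # $150,000,000
```

At that assumed scale, the block grant for a large research university would run on the order of $150 million per year, which gives a feel for what the competitive or formula-based distribution process would have to allocate.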

So, what are the benefits of my IIG funding system? Here is an unranked list.

·      The game of “science by proxy” is significantly reduced. Faculty return to performing research instead of being consumed by grant writing.
·      Independent, investigator-driven research is supported, permitting immediate funding of new pathways.
·      Both young and old investigators with a new idea receive base-level funding without an approval process. The Establishment and “me too” thinking do not rule the day.
·      Performance as a researcher instead of the ability to write successful proposals would drive base-level research funding.
·      Potentially innovative and disruptive research receives funding.
·      Peer review is appropriately maintained, but as a post-review, performance-based exercise. You are rewarded for what you do instead of what you say you are going to do. Performance-based peer review is much more of a merit-based system than one based on the merit of proposed research ideas – especially ideas chosen to satisfy the Establishment!
·      The massive and exponentially growing process of individual investigator (or small team) proposal review is replaced by a process that merges base-level grantsmanship with the tenure and promotion process. It would require some modification of the tenure and promotion process to factor in the cycle time of the IIG. Changing to this system of reviewing people instead of proposals would significantly increase the time spent in the laboratory instead of time spent on peer review and the funding bureaucracy.
·      Peer review would be placed where it belongs – principally in the hands of fellow faculty members and with input from external reviewers. Fidelity of the process could be assured through federal audit of the records since federal funding is involved.
·      The combined peer post-review process would increase the likelihood that fellow faculty members would become more familiar with their local peers. This would also be abetted by my open publication system.
·      The base-level funding would cover basic, applied, and developmental research – especially if faculty members are rewarded for commercialization of their research in the tenure and promotion process. This goal would be enhanced by the structure of the other parts of the overall grants system.
·      The financial burden on universities for startup grants would likely be reduced, although universities could still use startup grants for competitive advantage in hiring as an add-on to the IIG.
·      The development of regional innovation ecosystems could be enhanced depending on the distribution algorithm for IIG awards to universities.

How radical is my performance-based IIG? It’s not a jobs program – as some have described the present grants system – but a performance-based program to spur innovations with a sustainable, predictable funding model that puts researchers back to work doing independent, exploratory research – not bureaucracy. If you truly want to see a radical program, consider the proposal by Robert J. Birgeneau and Frank D. Yeary in The Washington Post of 27 September 2009 entitled “A New Model to Help Finance Higher Education.” Birgeneau and Yeary, chancellor and vice chancellor of the University of California at Berkeley, propose that “great public research and teaching universities receive basic operating support from the federal government and their respective state governments.” Their scheme would be “a 21st-century version of the Morrill Act.” They further state: “As with any daring scheme, the devil is in the details. … Yet such problems are solvable, if there is a will. … Simply put, no matter what the form, we must take some radical steps if we are to preserve the public character of America’s great public universities.”

The IIG system is potentially a simple version or small piece of the Birgeneau-Yeary model, but with real and positive consequences for the innovation enterprise. University researchers and innovators with a new idea should not be reduced to Depression-era beggars chanting “brother, can you spare me a dime?”

Friday, June 17, 2011

Innovation Thwarted: The Publication Bottleneck


By Keith McDowell

Are you one of those Americans who believe that healing in a specific individual can occur through intervention by the prayers of a large body of people intent on that fixed goal? Many believe in the power of mass prayer, but most consider it a matter of faith and “not science.” And if you were the fictional character, Katherine Solomon, in Dan Brown’s The Lost Symbol, where and how would you publish your earth-shattering new discoveries in noetic science, whether real or imagined?


But let’s be more specific. Do you believe that a positive personal attitude and a strong desire to live can overcome cancer or put it into remission – a variant of the placebo effect? How about other diseases or bodily ailments? And if you believe this, how would you prove it and could you get your findings published? Are we still in the realm of “not science?”

Even more specific, do you believe in reprogramming the human brain by making use of brain plasticity to overcome bodily ailments? Oops, wait a minute! Brain plasticity is “real” science and a proven fact. The brain can be rewired. But how far and to what purpose? Where is the dividing line between “science” and “not science?” Even more important for innovation, who decides what “scientific” discoveries and new knowledge are published, and how? Who gets access to that published knowledge – no matter how important or funky in the eyes of the beholder?

Amazingly, most people believe that the American research and development (R&D) publication system is perfectly fine, although innovation pundits often bemoan the recent putative decline in the percentage of publications by American authors. Indeed, why would people think otherwise? By all accounts and measures the publication system has been enormously successful in making America the world R&D leader and the center of innovation for decades.

But the reality, in my opinion and that of many others, is that the American R&D publication system so successful in the twentieth century is now a dinosaur: ponderously slow, built on a business model from the era of vinyl records, expensive to the point of breaking the budgets of university libraries, mostly inaccessible to anyone who is not a paying member of the R&D club, and an enormous waste of time and effort for researchers. In short, it sucks and it chokes innovation! But how did we get to this state and what can we do about it? And is there a new paradigm available to us for the twenty-first century?

First, a comment about digital electronic media versus the print medium is in order. No one doubts that society as we know it has been transformed in the past few decades to a new digital and information age. No one doubts that the print medium is rapidly fading into a niche market with newspapers going online and ebooks outselling hard-cover books. Although slow to embrace the change, the research publication system also “went digital” with online journals and CDs arriving periodically in the snail mail. In essence the same business model was used, but in a digital format.

Of course, cracks appeared in the model as experiments such as arXiv at Cornell were undertaken. The biggest crack came from various attempts to create systems of “open access” to research publications, such as the PLOS family of journals or the requirement by Congress that publications based on research funded by the National Institutes of Health be posted and freely available following an embargo period. The notion is that federally funded research should be accessible to anyone, not just those who can afford to pay the heavy price for journals or who have access to journals as a member of a club, taken to mean universities or businesses.

All such endeavors to reform the publication system are worthy, but are effectively incremental in nature – not transformative. They speak mostly to the issue of access in the digital age. But to approach a true transformation of the publication system, one must understand the full scope of how the publication system works. And we begin by asking the question: why do we have research journals? Here are my thoughts with some analysis as to how each feature plays out in the digital age.

Communication: It goes without saying that researchers and scholars want to “publish” their work, and print journals were historically the principal method, although books and newspapers also played major roles. Today, we have blogs, tweets, social media, cable TV, “open laboratory notebooks,” and a growing list of means to communicate.

Community: Historically, print journals permitted the formation of discipline-driven communities or tribes and indirectly produced a de facto partitioned search process to find specific material. If a scholar wanted to understand the “physics” of a particular process, one searched “physics journals” for relevant information. Unfortunately, the explosion of research publications and the breakdown of silos driven by convergence and trans-disciplinary modern research have made the concept of a “journal” problematic.

Claim: You can’t win the Nobel Prize or win the intellectual property (IP) royalty sweepstakes without staking a claim to your discoveries, findings, or data. Publication is the essential method of choice, taking account of the requirements of the patent process. Of course, there are issues with who owns the IP – taken broadly to include ideas and discoveries – as well as proprietary issues, trade secret issues, national security or classified material issues, and so forth.

Standards: Quality is an essential factor, whether in the scientific methodologies used or the presentation of the material.

Prestige: Journals pride themselves on being the best of breed, no matter that the “rules” for establishing the pecking order are arcane. Authors crow about having their articles published in the “best of the best.” Tenure and promotion committees make a fetish out of counting articles in prestige journals. So what does all this chest beating and hoopla have to do with the publication system? In my opinion, it’s at the core of why we haven’t transformed the publication system. It’s a final vestige of power by the good ole boy network of “clubs.”

Archive: The print medium cataloged by journal has been the longstanding method of archiving publications over the span of generations and even centuries. But in the past few decades, the sheer volume of journal space has overwhelmed libraries and led to the creation of tightly compacted, offsite storage centers.

Authentication: Ultimately, journals, through the process of peer review of articles or at the whim of the editor, serve as gatekeepers to keep “not science” or “bad science” from being published. Similar arguments are made in the liberal arts and other non-science fields with the qualifiers “not scholarly” or “bad scholarship.” Realistically, the peer-review system only weakly accomplishes these goals and instead has become an enormous sinkhole of time and effort for researchers, taking precious time away from being innovators.

Evaluation: Quite frankly, the journal peer review system has become a major tool for evaluating the performance of researchers and scholars – particularly in the university tenure and promotion system. It’s a surrogate system permitting tenure and promotion committee members to default to the judgment of journal peer reviewers and thereby avoid the task of actually reading their colleagues’ scholarly works. Who can blame them, given the job description for the modern faculty member?

Armed with this background on the current journal-driven publication system, we ask: must we use the centuries-old publisher model simply because the print medium was traditionally the only way and publishers have the resources to do it? Of course not! And I’m not talking about simply “going digital.” I’m talking about a radical transformation. Here is my proposal for an “open publication and access system,” taken from the perspective of an author. I use the phrase “online journal” lightly herein, recognizing that the concept is most likely going to disappear in the future and be replaced by an “online publishing service.” I also ignore the cost issue for the moment.

Credentialing and authentication of author(s) and reviewer(s): All people desiring to publish articles in an online journal or serve as reviewers should go through a credentialing process. There are many ways that credentialing can be accomplished, similar to what we already do on the Internet to figure out whether someone is who they say they are. Professional societies or universities could easily set up a credentialing process. I don’t envision a system with any real differences from the one we already have in place that permits someone to publish in a journal. Once credentialed with an online journal, the author or reviewer would be provided with an authentication system that permits login to the journal.

Prepare electronic article: The essential issue for the manuscript is quality through conformity to style and format. Universities and scholarly societies should bring together our best minds on style, punctuation, and whatever else we need – including figures, tables, and graphics – and set the rules for the format of all research articles, allowing for several possible styles. There is no need to persecute authors with over 2,000 varieties of endnote styles just so each proprietary journal can have its own “feel.”

Validate standard format: Once we produce a set of standards for style and format, universities – meaning their faculty members – agree to abide by the standards and demand that all publishing adopt them. We don’t have to play by someone else’s rules. Guess what! New startup businesses could emerge, such as Internet “editing” companies that check conformity to the standard for a small fee. Quality, as measured by conformity to simple publishing standards, will be protected.

Post “draft” to open access, online journal: The credentialed author would next post the electronic draft of the manuscript to the online journal. All articles would be formatted in a manner to permit search engines to find them, including lists of keywords and other such attributes. At this point, no peer review has occurred and only two criteria have been met: a credentialed author and a standard format.

Open peer review of a draft article: Once an article is posted on the online journal, credentialed users of the journal would comment on the article using a comment-type system as found in blogs, but with full disclosure of the identity of the reviewer. Reviewers could also contact authors directly with comments or suggestions. Reviewing would be a voluntary, self-selecting, participatory process. Researchers would only read and comment on the articles that interest them. The online journal would maintain statistics on all the hits, downloads, and other activity related to the draft article. After a period of time, probably six months, the author(s) would prepare a final version of the article.

Revise and post “final” version: Following the review period, the author(s) would post the final version of the article. Commentary or blogging on the article would continue.
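To make the flow just described concrete, here is a minimal sketch of the article lifecycle in code. It is purely illustrative: the field names, the six-month review window, and the helper functions are my own assumptions for the sake of the example, not a specification of any existing system.

```python
# A minimal sketch of the proposed "open publication - open access" lifecycle:
# a credentialed author posts a standard-format draft, credentialed reviewers
# attach signed comments, and after the open review window the author posts a
# final version. All names and the review window are illustrative assumptions.

from dataclasses import dataclass, field
from datetime import datetime, timedelta


@dataclass
class Comment:
    reviewer_id: str    # credentialed reviewer, identity fully disclosed
    text: str
    posted: datetime


@dataclass
class Article:
    author_ids: list[str]    # credentialed author(s)
    manuscript: str          # assumed to conform already to the standard format
    keywords: list[str]      # exposed so that search engines can index the article
    posted: datetime
    status: str = "draft"    # "draft" -> "final"
    comments: list[Comment] = field(default_factory=list)
    review_window: timedelta = timedelta(days=182)  # roughly six months


def add_comment(article: Article, reviewer_id: str, text: str) -> None:
    """Open, signed peer review: any credentialed user may comment at any time,
    on the draft or on the final version."""
    article.comments.append(Comment(reviewer_id, text, datetime.now()))


def post_final(article: Article, revised_manuscript: str) -> bool:
    """After the open review window has elapsed, the author posts the final version."""
    if datetime.now() - article.posted < article.review_window:
        return False  # still within the open review period
    article.manuscript = revised_manuscript
    article.status = "final"
    return True
```

The essential point the sketch captures is that the only gates are credentialing and the standard format; review is open, signed, and continuous rather than a pre-publication bottleneck, and the online journal simply records the activity.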

The “open publication – open access” system that I propose has several issues that need to be addressed. First, who manages the online, open access journal or service? I don’t think it matters whether it’s done by private business, professional societies, or universities. The journal is simply a gateway, not unlike other social media, and should be managed in that manner. The author is responsible for the content.

Second, who serves as the archivist? In my opinion, universities must regain the upper hand as archivists for the research publications of their employees, mainly faculty members. We can debate whether and how the copyrights to research publications should remain the intellectual property of the university or of the author(s), but they should remain with one or the other, with the condition that the university is permitted to post the article on the web and is responsible for archiving it. Again, I foresee the growth of startup internet companies whose role is to service the needs of university libraries to store and archive articles. Archiving is a major issue for the whole of the digital age. I have no doubt that solutions will emerge. Transformation of the publication system should not be held hostage to the problem.

Finally, who pays? Somebody pays. No publication system can sustain itself without someone paying for editors who validate format, for server farms that host the online journal, or for the required staffing at the Internet gateway. Personally, I see opportunities for entrepreneurs to move into this arena. Facebook, YouTube, and Google already have the resources and expertise to make the system function.

The true success of a research publication is not whether it passes through numerous gatekeepers or conforms to accepted standards. Success directly correlates with the content of the publication and the ability to replicate or authenticate the content. Success correlates with the impact of the content, even if wrong. Bad ideas often lead to the right ideas. Certainly bad science will get published, but have you read the research journals lately? The bright line between science and not-science is in no danger from my proposal. And so what if some junk science slips in? Bad ideas and crazy research will die of their own weight as they always have. We are in no danger here. We don’t need a gatekeeper, authoritarian system to weed it out through the massive review system that we currently have. That system is too expensive and takes too much time and effort with very little to show for it.

What about proper evaluation of faculty performance? Is it really necessary for us to have the current peer review publication system to serve as a surrogate for the functioning of tenure and promotion committees? If we free up the escalating time spent reviewing articles for publication that we’re not very interested in, we might actually have time to take a look at the publications of our fellow faculty members! That would be a good thing.

So, let’s agree that my system or some modification will satisfy the basic requirements of a publication system. What are its advantages? There are many. It removes unnecessary gatekeeping and vastly speeds up the “time to market” of research. It provides much needed open access. Ultimately, I believe it will reduce the enormous cost of journal subscriptions for university libraries. The world has changed and publishing is not the same anymore. Universities and their faculty need to take back that which is important to them – control of the dissemination of their research. The electronic information age provides us the means to do it – if only we give a little and rethink the gatekeeper mentality through open peer review. When we get used to the system, it will provide the same bright line between science and not-science and about the same level of quality in content. We are the responsible party. We can and must make this transformation.

How will these changes affect the innovation ecosystem? They will reduce the workload of faculty and increase time spent on education of the workforce and on research. Open access to research speaks for itself as an accelerator of innovation. New startup internet companies will emerge to carry out the functional tasks of servicing the new publication system. As mentioned, I foresee “editing” companies and various forms of “cloud computing” through provision of servers and hardware. Credentialing and authentication systems will emerge. In essence we will disassemble the current publication system and put it back together through functionality as opposed to a single provider.

Will this open publication and open access R&D publication system work? Emphatically yes! It’s inevitable, and it’s time for the transformation.

Thursday, June 9, 2011

The Missing Link: Lablets as Innovation Hubs


By Keith McDowell

Richard Dawkins in his wonderful book The Greatest Show on Earth: The Evidence for Evolution presents a compelling and overwhelming scientific and logical case for the theory of evolution. One of his major points is that the standard vernacular phrase used to encapsulate evolution, “man descended from the apes,” is not correct. The correct phrase should be something like “man and the modern ape are descendants of a common ancestor, each representing different evolutionary branches from that ancestor.” I suspect that disbelievers will not be mollified by such a correction or change their minds. Nonetheless, scientists continue the search for all of our differentiated ancestors along the evolutionary tree, whether they link to apes or not. Unfortunately, to the chagrin of the science community, the search for the common ancestor became known as the search for “the missing link.”

Another aspect of evolution or systems that grow from genetic-like algorithms is the expectation that all available niches in the ecosystem will be filled after a sufficient passage of time. And if not, then the ecosystem can be forced or driven to fill a niche – much in the way that animals and specific crop varieties have been “bred” by human manipulation. The concept of “directed evolution” has even been applied to social systems and, more specifically, to innovation. As an example, Ideation International Inc. provides an entire suite of business services built on “directed evolution” as a tool to obtain a competitive edge.

Humankind’s quest to understand the dynamics of systems, no matter their character or nature, is a worthy and ongoing endeavor. Indeed, the very essence of innovation is the emergence of something new from a system, whether done by man for commercial purposes or by Mother Nature to invent the superbug that is resistant to all drugs. And specifically as a society, it behooves us to study and understand that which we call “the American innovation ecosystem.” It is an ecosystem created and driven by human manipulation through government rules and regulations and many other factors. But have we created the best of breed? Or have we driven ourselves to an evolutionary dead end?

Specifically, I posit the following question as one of many that we need to answer for ourselves.  Does the American innovation ecosystem have all available niches filled? Or, to twist the tail of the phrase “the missing link,” are we “missing a link?” Do we need to evolve a new breed of dog? My answer is emphatically yes!

In fact, it is quite easy to see that we are “missing a link” by observing the scale and scope of the R&D enterprise from individual investigators to large projects on the order of the Apollo Program or the Manhattan Project, both with respect to the number of people directly involved and the level of funding. At the present time there is a continuum in these two metrics from individuals to centers to large centers – the largest being the Clinical and Translational Science Award (CTSA) program of the National Institutes of Health. The CTSA program had made 46 awards as of October 2009 with an expectation of growing to 60. Typical awards ranged from $5 million to $10 million per year for five years. The common vision of CTSA consortium members is “to reduce the time it takes for laboratory discoveries to become treatments for patients, to engage communities in clinical research efforts and to train clinical and translational researchers.”

As we move along the axis defined by a combination of the two metrics, people and funding, we find a dip nearly to zero before we encounter national laboratories and large-scale projects on the order of a thousand people and a billion dollars. It’s the missing link – an unfilled niche in the innovation ecosystem.

Some would argue that industrial R&D laboratories fill the niche, and they do to some extent, although many have pointed to the demise of basic research within such industrial labs.

Even more interesting, although difficult to prove easily, many of our societal grand challenges require a scale and scope perfectly matched to the “missing link.” And we wonder why America is falling behind in the innovation game!

But there is a solution vector to fill the niche caused by the “missing link.” It’s a new entity or, if you like, breed of dog called by some “lablets” and by others “innovation hubs.” I personally prefer “lablets” because “innovation hub” is a separate and equally important concept requiring a name. Indeed, lablets would be part of an innovation hub.

The concept of “lablets” is not new and has been under discussion and development for the past decade. For example, a 2005 draft report from the National Academy of Engineering entitled Assessing the Capacity of the U.S. Engineering Research Enterprise addressed many of the features of a lablet. More recently, the U.S. Department of Energy introduced the notion of energy innovation hubs and funded several across the United States.

So, what is a lablet? A lablet is an entity competitively funded at a scale of $25 million per year, for a five-year term, with additional start-up funding of $10 million for space renovation, equipment, and instrumentation. The entity would directly fund approximately 100 researchers. As I see it, the basic purpose and essential features of a lablet encompass the following goals:

·      Create discovery-to-innovation institutes melding interdisciplinary research, education, outreach, and practice.
·      Engage in transformational, use-driven R&D targeted to address or solve a specific, identified, and vetted societal challenge structured to avoid “me too” research.
·      Form crosscutting “dream teams” of outstanding scientific leadership that can recruit and nurture extraordinary talent and instill high expectations.
·      Manage by “best practice” project management with oversight from an external advisory board.
·      Pursue “open innovation” emphasizing “gateways” rather than “gatekeepers.”
·      Form public-private partnerships composed of all innovation entities with each contributing resources, but with the federal government being the principal financial supporter.
·      Generate connectivity across universities, national laboratories, industry, research institutes, and other players in the innovation ecosystem with due consideration of differentiated missions and cultures. Such connectivity with the “lablet” as a hub promotes and accelerates commercialization.
·      Focus on trans-disciplinary challenges requiring transformational engineering to affect “the global, knowledge-driven society of the twenty-first century.”
·      Disperse the institutes geographically to make use of all resources and consequently benefit all regions of America.
·      Provide experiential learning for undergraduate and graduate students from “engineering, management, medicine, law and social sciences.”
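For a rough sense of scale, here is the simple arithmetic behind the numbers quoted above. The annual funding, term, start-up funding, and researcher headcount come from the description of a lablet; how the dollars would actually be split among salaries, equipment, and operations is left unspecified.

```python
# Simple arithmetic on the lablet scale described above. The dollar figures and
# headcount come from the description; the split among salaries, equipment, and
# operations is not specified and is left out of this sketch.

ANNUAL_FUNDING = 25_000_000   # dollars per year
TERM_YEARS = 5
STARTUP_FUNDING = 10_000_000  # space renovation, equipment, instrumentation
RESEARCHERS = 100             # approximate number directly funded

per_researcher_per_year = ANNUAL_FUNDING / RESEARCHERS
total_five_year_cost = ANNUAL_FUNDING * TERM_YEARS + STARTUP_FUNDING

print(f"Funding per researcher per year: ${per_researcher_per_year:,.0f}")   # $250,000
print(f"Total five-year cost of one lablet: ${total_five_year_cost:,.0f}")   # $135,000,000
```

The point of the arithmetic is the total scale: at roughly 100 people and $25 million per year, a lablet sits squarely in the gap between individual or center grants and the thousand-person, billion-dollar scale of national laboratories.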

Lablets provide the ultimate in flexibility and adaptability as the entities from which they are formed self-assemble into new lablets when societal grand challenges are solved and new ones are found. They are not fixed organizations that outlast their utility or serve as an ossified jobs program.

In the final analysis, support for lablets is all about culture and what we value. Do we want incentive systems that attract all the players because of access to a profoundly fun network that is funded and sustainable? Do we want communication and connectivity across all the players that increase the complexity index of the innovation ecosystem as an organism? Do we want a layered structure that allows investigators to function on multiple planes? Do we want societal grand challenges attacked in a manner that can get the job done and thereby enrich our lives? Do we want innovations that lead to commerce and economic prosperity? Do we want the jobs that will result from such activity? Of course we do! Government can effect these improvements by stepping in where others will not tread. Government – namely, you and I – must fund the innovation ecosystem. And I believe that the lablet concept is an essential new feature of that ecosystem. It’s the link that’s missing from our American innovation ecosystem.