Crowd Innovation: Hype or Help?

Crowdsourcing at NASA: About the Work Behind Having Others Do the Work

Interview with Ryon Stewart, Challenge Coordinator at NASA’s Center of Excellence for Collaborative Innovation (CoECI)

NASA’s record of innovations is truly awesome. Every child knows about the first man on the moon and the space shuttle program, or marvels at images of outer space transmitted from NASA missions. It is less well known that even NASA’s world-class engineers tap into the wisdom of crowds to solve their problems and invent groundbreaking solutions. In our interview, Ryon Stewart explains that innovation is less about a genius sitting at a desk and having a light-bulb idea, and more about finding solutions that already exist – somehow, somewhere. Learn how NASA uses crowd power, why NASA’s workforce still won’t run out of work, and how even the bison at Yellowstone National Park contributed to problem solving.

KURT MATZLER: NASA is known to have an extraordinarily skilled and talented workforce. How did you get the idea that anybody would be able to solve problems better than you can on your own?

Ryon Stewart: The idea was born back in 2009. That year, NASA’s Human Health and Performance Director, Jeff Davis, attended an executive training course at Harvard Business School and learned about the method from Karim Lakhani, who is very well known in the crowdsourcing world. Karim basically inspired Jeff to pursue crowdsourcing. Jeff’s organization was facing a pretty drastic funding cut, and he understood that crowdsourcing might be a way to sustain his R&D portfolio with less money than before.

And the early projects turned out to be successful?

Yes, they ran some pilots using the InnoCentive platform and found them quite successful. Around the same time, the Human Exploration and Operations Chief Technologist at NASA Headquarters started doing similar pilots on the software and algorithm side, using Topcoder. Both of these concurrent pilots were having successes, and so we went on.

You now work for the CoECI at NASA. So, you have a whole unit for crowd projects?

Right. At about the time of our first projects, the Obama administration was looking to take advantage of all the skills within the country and get the most for the taxpayer. They requested that NASA establish a center of excellence to help NASA and other government agencies take advantage of crowdsourcing. So, in 2011 the CoECI was officially started, and we’re still here, helping lots of people and government agencies use crowdsourcing.

I assume that by now you have an impressive list of successful projects?

We operate the NASA Tournament Lab with its contracts and mechanisms. We have done close to 400 challenges so far. This number includes internal and external projects for NASA and other federal agencies. So far, we have had about 25,000 unique submitters who carried their solution ideas the whole way to the finish line. In terms of registered participants, the number is around 200,000 across all the different things we’ve done so far. And we’ve awarded prizes of about $6.5 million since 2009, with a lot of that coming from other federal agencies.

About Ryon Stewart
Ryon Stewart serves as Challenge Coordinator at NASA’s Center of Excellence for Collaborative Innovation (CoECI). He is responsible for fostering the use of open innovation tools at NASA and other parts of the Federal Government and coordinates crowdsourcing projects for multiple agencies. Ryon has an Aeronautics and Astronautics Engineering degree from the University of Washington and has worked at NASA's Johnson Space Center since 2008. His early responsibilities at NASA gave him hands-on experience with operations facilities, GN&C (guidance, navigation & control) engineering, and robotics engineering. Later, he worked full time in ISS Flight Operations as a flight controller, supporting over 2,200 hours of real-time execution across visiting-vehicle dockings, undockings, reboosts, and other precision operations. He also worked as an instructor, teaching flight controllers, instructors, and astronauts about the ISS motion control system and the soft skills required for the job.

About NASA’S Center of Excellence for Collaborative Innovation (CoECI)
The CoECI was established by NASA in 2011 at the request of the White House Office of Science and Technology Policy (OSTP). CoECI guides NASA and other agency teams on all aspects of implementing crowd-challenge-based initiatives, from problem definition to incentive design to post-submission evaluation of solutions. This service allows other agencies to experiment with new methods with a quick turnaround, before formalizing their own capabilities. Since its inception, research into the use of crowdsourcing has been central to NASA’s efforts. All CoECI challenges are managed under the umbrella of the NASA Tournament Lab (NTL), which recently expanded its capabilities beyond software and algorithm development. The NTL now offers a variety of open innovation platforms that engage the crowdsourcing community in challenges to create the most innovative, efficient, and optimal solutions for specific, real-world challenges faced by NASA.

The interviewer
Professor Kurt Matzler conducted the interview in November 2019.

What would you consider your most successful project so far?

This is a tough question. We have a lot of interesting and very successful ones. Clearly, the challenge with the biggest reach was the “Space Poop Challenge”. It looked at improving the methods for human waste management during long-duration space activities. Spacewalks had never lasted longer than about seven hours, and seven hours of waste management can be handled okay. The challenge targeted a period of 144 hours. We got some interesting responses that helped the continued development of future space suits. Obviously, people find this subject funny, and the project went viral and got covered by basically every major US news media outlet.

And apart from publicity, are there any other projects that stand out?

Some projects on our internal crowdsourcing platform “NASA@WORK” have helped us save a lot of money. NASA@WORK reaches out to the NASA crowd itself. We have had lots of situations where folks came to us with a problem they were ready to fund with a few million dollars and multiple years of development. And then, when they posted it on NASA@WORK, it turned out that someone at the same or another NASA center already had the answer, at least partially. That has happened many, many times. As an example, someone at Johnson Space Center was looking for a way to better measure urine in microgravity. He was about to spend, I think, $1.3 million and three to five years of development, and posted the problem instead. It turned out that somebody only a few hundred yards away at the same NASA center already had a prototype, developed for a different reason. That person was able to respond on the NASA@WORK platform, and what already existed could be used. We have lots of situations where NASA@WORK broke through silos. This was really great, because wherever you work there are silos, sometimes even between team members.

How do you measure crowdsourcing success?

One thing we do is ask the challenge owners before the challenge what they think the project would cost using traditional methods. Then, after the project, we have a closeout interview as well, where we ask them for the level of advancement: “Was the solution advanced not at all, incrementally, significantly, or was the problem actually solved?” Anything from incremental advancement to solved is considered successful, and we’ve seen a 94% success rate, which is huge. Approximately a third of our challenges reach incremental advancement, another third reach significant advancement, and one third is classified as solved, which is crazy.

Do you collect metrics apart from success rates?

We also capture savings compared to traditional methods in a similar way. In 80% of our challenges we see cost savings averaging 41%. For managers, utilizing us instead of doing what they would have done actually works out like a negative cost. In fact, we’ve saved about $32 million for NASA so far by utilizing the NASA Tournament Lab. You see, we capture interesting metrics, and we have to utilize those to continuously sell crowdsourcing to everyone at NASA.
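To make the bookkeeping behind those two answers concrete, here is a minimal sketch in Python of how closeout ratings and cost estimates like the ones described above could be tallied into success-rate and savings metrics. All records, field names, and dollar figures are invented for illustration; this is not CoECI’s actual tooling.

```python
# Hypothetical closeout records: the owner's advancement rating, their
# pre-challenge cost estimate for traditional methods, and the actual
# cost of running the challenge. All numbers are invented.
closeouts = [
    {"advancement": "solved",      "traditional_estimate": 1_300_000, "challenge_cost": 250_000},
    {"advancement": "significant", "traditional_estimate": 400_000,   "challenge_cost": 300_000},
    {"advancement": "incremental", "traditional_estimate": 200_000,   "challenge_cost": 180_000},
    {"advancement": "none",        "traditional_estimate": 500_000,   "challenge_cost": 520_000},
]

# Anything from incremental advancement to solved counts as a success.
successes = [c for c in closeouts if c["advancement"] != "none"]
success_rate = len(successes) / len(closeouts)

# Savings are measured against what the owner said they would have
# spent using traditional methods.
total_savings = sum(
    c["traditional_estimate"] - c["challenge_cost"]
    for c in closeouts
    if c["traditional_estimate"] > c["challenge_cost"]
)

print(f"success rate: {success_rate:.0%}, total savings: ${total_savings:,}")
```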

Wow. That’s impressive. Are you also able to implement all solutions?

Ninety-four percent of our solutions were implemented or are planned to be implemented. This is another metric we look at. We have learned over the years how important this is, as a lot of folks don’t have a good plan for implementing their solutions. We work really hard with the challenge owners to make sure they have the clout and authority to get it done afterwards. That’s very important, because if you’re given a novel solution but don’t do anything with it, then what was the point? Then it becomes extracurricular, which is the stigma crowdsourcing has often carried anyway.


Do you run crowdsourcing projects yourself or do you collaborate with partners?

We collaborate with partners for pretty much everything we do. Even for our internal platform, there’s a vendor who owns the site, helps us update the platform, and figures out problems. We collaborate with platforms like InnoCentive, Topcoder, Kaggle, Luminary Labs, and HeroX. Right now we have ten vendors; they all have different skills and often self-select depending on what they know best. Once a vendor is selected, they do most of the work, which is a really good selling point for the problem owners. The vendors help frame the problem, develop the contest, and help execute it.

Who selects and evaluates the ideas and solutions? Do you delegate this to the vendors as well?

The vendors help select ideas by evaluating them against the defined requirements. So you have to make sure your requirements are clear enough to differentiate between solutions. For an algorithm challenge, it’s mostly easy: you’re almost always just picking the highest score. But for ideation or creative-type challenges, it’s more difficult. Our contracts specify that vendors filter the ideas. We’ve had challenges with hundreds of responses from the general public, each response up to 30 pages talking about something in space, and many nowhere near what we want. Even in a curated crowd, many responses are not good. The vendors read all the papers and identify which ones meet the requirements. Then they deliver those to our NASA teams, and the problem owners only have to read through that shortlist to pick a winner.
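As a rough illustration of that split between objective scoring and requirements-based filtering, here is a minimal Python sketch. The submission records, requirement names, and field layout are all invented for illustration; a real vendor pipeline would be far richer.

```python
# Hypothetical submissions: algorithm entries carry an objective score,
# ideation entries carry the set of requirements they were judged to meet.
submissions = [
    {"id": "A1", "type": "algorithm", "score": 0.91},
    {"id": "A2", "type": "algorithm", "score": 0.87},
    {"id": "I1", "type": "ideation", "requirements_met": {"mass", "power", "safety"}},
    {"id": "I2", "type": "ideation", "requirements_met": {"mass"}},
]

# Invented requirement set for the ideation track.
required = {"mass", "power", "safety"}

# Algorithm challenges: almost always just pick the highest score.
algo_entries = [s for s in submissions if s["type"] == "algorithm"]
algo_winner = max(algo_entries, key=lambda s: s["score"])

# Ideation challenges: vendors filter against the stated requirements,
# and only the resulting shortlist goes to the problem owners for judging.
shortlist = [
    s for s in submissions
    if s["type"] == "ideation" and required <= s["requirements_met"]
]

print("algorithm winner:", algo_winner["id"])
print("ideation shortlist:", [s["id"] for s in shortlist])
```

The design point the sketch mirrors is that the objective track can be decided mechanically, while the creative track can only be narrowed down mechanically; the final pick stays with the problem owners.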

How do you motivate contributors to participate in your challenges?

For a lot of the challenges, that’s really up to the vendors. They understand their crowd, and it’s their job to maintain and curate that crowd and keep people interested. That’s part of what we pay them to do. They know how much money it takes to get the right kind of answers. If you offer too much money, the crowd might think the problem is too hard and their solution won’t be good enough. If you offer too little, then people might think it’s not worth their time.

What are typical problems that can arise during a project and how can they be solved?

Probably the biggest problems emerge when the challenge-owning team is not ready to take the solution – for instance, if the solution is a software application and they haven’t coordinated the implementation with their IT department ahead of time. If people haven’t done all of the legwork, you get a problem. The people in-house need to be able to integrate the new ideas into their platform and their architecture. People are often afraid of the idea of outsourcing work, but really there’s still lots of work to do even when you’re handed a solution.

Are there any problems with intellectual property issues? Don’t you fear giving away secrets that people would use for their own purposes?

Together with the vendors, we define what IP (intellectual property) we want to give the solvers as part of the challenge. Usually, the more IP you’re willing to give, the better the solutions will be, especially for hard problems. If respondents are pretty much inventing something new, they might back out partway through and say, “I don’t want to win this prize. I’m going to go start my own company.” So you have to be careful about how much IP you’re willing to give. Generally, because we’re the government, we take a government-use license under which solvers grant us, usually in perpetuity, the ability to use whatever that thing is, but we still allow them to go start a business on the side if they want. For private industry it will probably be a little more complicated.

Data might be too sensitive to be shared. How do you handle this challenge?

For a data-science-type problem, you can change the labelling of some data or share only part of your dataset to obscure its meaning. For instance, if we’re doing a challenge on astronaut health, we can’t share health data. So we make sure that we have just columns of numbers, or we scale the data differently. There are lots of things you can do to prevent people from reconstructing exactly what the data originally represented.
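A minimal sketch of that kind of obfuscation might look like the following. The dataset, column names, and scale factor are entirely invented, and a real release would of course also go through review by the data owners; this only illustrates the mechanics of stripping labels and rescaling.

```python
import pandas as pd

# Invented stand-in for a sensitive dataset.
raw = pd.DataFrame({
    "astronaut_id": [101, 102, 103],
    "heart_rate_bpm": [62.0, 71.0, 58.0],
    "sleep_hours": [6.5, 7.2, 8.0],
})

SCALE = 3.7  # arbitrary factor so absolute values become meaningless

anonymized = (
    raw.drop(columns=["astronaut_id"])           # strip identifiers
       .rename(columns={"heart_rate_bpm": "x1",  # generic column labels
                        "sleep_hours": "x2"})
    * SCALE                                      # rescale so units can't be inferred
)

# Optionally share only a sample of the rows rather than the full set.
released = anonymized.sample(frac=0.67, random_state=0)
print(released)
```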

Do you always reveal that it’s NASA that seeks a solution, or does it sometimes make sense not to disclose who the sponsor is?

Obfuscating a problem can be very good in some cases. If we reframe the problem to disguise who we are or what the problem is, it gets hard for folks to know what’s really going on. A few years back, the CIA actually ran a challenge without listing themselves as the CIA. The challenge was to use only social media posts to track particular bison in Yellowstone National Park. In fact, they wanted to use the algorithm to help track Russian actors in Crimea, and they eventually were able to do so. They didn’t post it as that, to avoid attracting bad actors submitting bad responses that might change the outcome. Getting people to help you find bison in Yellowstone National Park relies on the same concept and is really harmless. The vendors are very good at helping folks like me reframe and restructure those problems so that we need not worry about leaking too much intellectual property or sensitive data.

You mentioned several times that you actively sell the idea of crowdsourcing within NASA. Is it difficult to convince people to play along?

We do have the “not invented here” syndrome, and people who think that their problem is unique. It’s a culture shift, and a lot of folks believe that we at NASA know our stuff better than anybody else. We have to explain to people that just because you’re a chemist in a chemistry lab, a chemistry problem might not be the best thing for you to solve alone if you want a breakthrough solution. It might be, but if you put a problem out there, people will come up with new things that you have never thought about. So we do a lot of work convincing them with roadshows, where we present the general successes based on the metrics we collect. We also share all our case studies. A few years ago, for instance, a group was looking to improve the ability to send and receive large e-mail files between Mission Control and the International Space Station. They really had to improve the networking protocol for space – extending the internet into space more than before. It was a problem that seemed unsolvable to some of them. They got an answer using an external platform and were blown away that it worked. It’s now implemented and still in use on the Space Station today. So, we definitely try to shape the culture to help people understand that crowdsourcing is good for NASA.

Are the engineers and scientists worried that they might become redundant because of crowdsourcing?

We show them that crowdsourcing is not taking away jobs: maybe a design is coming from someone outside, but you’re the one who has to integrate that design, you’re the one who can build on that design. Often, a design or an idea isn’t your end goal – it just changes where you start. Even not getting good solutions can validate that NASA was doing the right thing, because not even the rest of the world could think of anything better.

For which problems does crowdsourcing provide the greatest benefits to NASA?

Contests work well when the combination of skills or even the technological approach is not obvious. Often, trying something and not finding the solution can be due to inherent biases that you are unaware of. Reaching out to a large crowd of experts can help reframe the problem so that those inherent biases disappear. You can get many diverse skills and backgrounds, and maybe they end up being smarter than your own technical domain. The right combination of skills and backgrounds will be out there. They will throw new perspectives at you, and you might get big, really good solutions. If the problem is very well defined and you know for sure that only a certain kind of work could get it done, then contests might not be the right route.

Finally, based on your experience with many crowdsourcing projects, what advice would you give an organization that is planning to start with crowdsourcing?

If a company wants to start its own crowdsourcing projects or a center of excellence, it has to be flexible and patient. You have to be able to understand different kinds of problems and handle a lot of rejection, because crowdsourcing will be new; it’s not traditional and often scary, like we were just talking about. At least to start off with, you should take advantage of existing platforms and their crowds. Definitely capture metrics for basically everything you do. Think about metrics that are important to your stakeholders and to your potential challenge owners, and use them to explain to people why crowdsourcing is good. But then also find interesting stories that can go along with those numbers. Really good case studies are a great way to convince people.

Thanks so much for sharing your crowdsourcing insights and success stories with us. We sure hope to read about further extraordinary projects in the media soon. And with the help of the crowd, I am sure, you will send humans to Mars not too long from now.