A crowdsourcing platform opens up research on a global scale
There’s nothing unusual about the fact that Chiraag Sumanth is studying computer science at Stanford, except for how he got here: jumping from a solid but lesser-known college in India to one of the most competitive master’s programs in the world.
Sumanth’s journey started about two years ago when he responded to a Facebook post inviting people worldwide to join a new Crowd Research Initiative directed by faculty at Stanford, the University of California, Santa Cruz, and Cornell Tech. The initiative sought to create a web-based platform for carrying out research on a global scale, while also giving more people a chance to work with professors at top institutions to gain experience and mentorship that could advance their careers.
Now, the 23-year-old from Bangalore is almost done with the master’s program he was able to join thanks to his work as one of the 1,500 students from 62 countries who took part in the first studies conducted under the Crowd Research Initiative.
“Chiraag helped to create a game-like interface that made it fun for participants to take part in our online study,” said Sharad Goel, an assistant professor of management science and engineering who supervised Sumanth long-distance and recommended him for admission to the master’s program.
Crowd organization
Millions of students worldwide find themselves in Sumanth’s position, with academic ambitions that might exceed the options open to them by virtue of where they go to school. That same predicament once described Rajan Vaish, who oversaw the Crowd Research Initiative as a postdoctoral scholar with Michael Bernstein, a Stanford computer scientist whose research focuses on using the internet as a new medium for organizing human activity.
“To find a position as an engineering or science researcher you need letters of recommendation,” said Vaish. “We realized that social media platforms and crowdsourcing models provided a way to expand access to mentorship, while also producing valuable research in the process.”
Vaish will summarize the Initiative’s early results at an Association for Computing Machinery conference in Quebec City, Canada, where the team’s paper won a Best Paper Honorable Mention. It was Vaish, remembering the challenges he had faced years earlier, who first began work on the Initiative. At the time, he was a PhD student in the lab of UC Santa Cruz computer scientist James Davis, who remains a leader of the effort.
Along with collaborators at the MIT Media Lab and Cornell Tech, the Stanford and UC Santa Cruz professors leading the Initiative have sponsored a series of crowd research projects over the past two years. Several experiments, focused on human-computer interaction, data science and computer vision, have since resulted in papers accepted through peer review at top-tier computer science conferences. From this work, 33 participants received letters of recommendation from one of the supervising faculty to support their applications for graduate school or jobs. Two-thirds of those gained admission to top global universities, while dozens who didn’t receive letters still gained research experience that will aid their careers at home.
“For many participants, our letter of recommendation was the only one they had from a globally recognized university,” Bernstein said. “So, yes – I think we’re seeing a way to broaden the ranks of researchers in the interests of social justice.”
Putting the crowd into science
Crowdsourcing in research isn’t new. For years, so-called “citizen science” efforts have marshaled large groups to devote brain power or computer resources to such tasks as surveying birds, doing mathematical proofs, searching for radio signals of extraterrestrial life or divining the mysteries of how proteins change their shape.
But most citizen science studies leverage large numbers of participants to do rote work. The Crowd Research Initiative sought to give participants more ways to influence the subjects and directions of research, thus demonstrating the quality of their thinking and technique.
“At top universities, PhD-level research is more a voyage of open-ended exploration and discovery, and we wanted participants to experience that,” Bernstein said.
The Initiative recruited participants via a web page that described the project and identified the principal investigators and associated universities. The distributed research team organized work and communicated through online discussion platforms, then discussed progress and milestones at a globally streamed video meeting.
As the research matured, decisions about which authors would receive credit and which participants would receive recommendation letters became important. Vaish and his team introduced a decentralized credit approach in which each participant shared their assessment of who had contributed to the project. An algorithm merged that feedback into a global ranking, giving the principal investigators a concrete basis for comparison in their recommendation letters.
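To make the idea concrete, here is a minimal sketch of how such peer assessments might be merged into one ranking, assuming each participant splits a fixed budget of credit among peers and a PageRank-style iteration weighs each assessment by the rater’s own standing. The names, data and weighting scheme below are illustrative only, not the Initiative’s actual algorithm.

```python
from collections import defaultdict

def aggregate_credit(assessments, damping=0.85, iterations=50):
    """Merge peer credit assessments into a global ranking.

    assessments: {rater: {peer: share, ...}} where each rater's shares sum to 1.
    Returns a list of (person, score) pairs, highest credit first.
    """
    people = set(assessments) | {p for shares in assessments.values() for p in shares}
    score = {p: 1.0 / len(people) for p in people}

    for _ in range(iterations):
        new_score = defaultdict(float)
        for rater, shares in assessments.items():
            for peer, share in shares.items():
                # A rater's current standing weights how much their assessment counts.
                new_score[peer] += score[rater] * share
        # Damping keeps everyone above a baseline, as in PageRank.
        score = {p: (1 - damping) / len(people) + damping * new_score[p] for p in people}

    return sorted(score.items(), key=lambda kv: -kv[1])

# Hypothetical example: three participants rating one another's contributions.
assessments = {
    "alice": {"bob": 0.7, "carol": 0.3},
    "bob":   {"alice": 0.5, "carol": 0.5},
    "carol": {"alice": 0.8, "bob": 0.2},
}
for person, credit in aggregate_credit(assessments):
    print(f"{person}: {credit:.3f}")
```

One appeal of this kind of aggregation is that no single participant’s opinion dominates: a rating carries more weight only if the rater is themselves judged by peers to have contributed.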
Transformative experience
For Sumanth and other participants, the results were transformative.
“At first I could hardly believe I was here,” said Sumanth, who is now almost done with his studies and looking forward to applying what he has learned back home. Asked what struck him about the U.S. instructional style, he paused for a moment.
“Here everything is practical and hands on, and yet casual and conversational,” he said. “There is not the same sense of the lecture that I remember, but the professors have a way of getting students enthusiastic about ideas and concepts in ways that stay with them outside of the classroom.”
For their part, Bernstein and Vaish are already recruiting principal investigators for a new set of Crowd Research projects, and they hope to venture beyond computer science to create opportunities for students interested in the social sciences, the humanities and the sciences.
“Ultimately we have two goals,” Vaish said. “To pursue research at uncommon scale, and to give the participants opportunities that will change their lives.”
Additional Stanford co-authors are Geza Kovacs, Ranjay Krishna, Imanol Arrieta Ibarra and Camelia Simoiu. Additional authors include Snehalkumar “Neil” S. Gaikwad of MIT Media Lab, and Andreas Veit, Michael Wilber and Serge Belongie of Cornell Tech. This work was supported by the Office of Naval Research, the University of California, Santa Cruz, Los Alamos National Laboratory, Toyota and the Hasso Plattner Design Thinking Research Program.