The 4th International Workshop on

Crowd Sourcing in Software Engineering

Buenos Aires, Argentina · May 22, 2017

Held in conjunction with the ACM/IEEE International Conference on Software Engineering (ICSE) 2017



See our previous editions: CSI-SE 2016, CSI-SE 2015, CSI-SE 2014.

Keynote Speakers


Prof. Rick Kazman

University of Hawaii; Principal Researcher, SEI/CMU

On Leading a Crowdsourced Project

Software engineering is increasingly a community effort, and its success depends on balancing many factors: distance, culture, global engineering practices, and more. However, there are many socio-technical factors that may increase “social smells” and “architecture debt”, and these may lead to employee turnover, reduced effort, high error rates, increased churn and, in the end, failed projects. Such factors therefore require attention from community leaders—architects, project managers, and committers in open-source communities—to mitigate the effects of such debts. To understand and support the social health of a software development community, I will discuss two concepts that may change how crowdsourced projects are designed, analyzed, and managed. The first concept is a community quality model—a set of metrics to track organizational and socio-technical qualities. I will present research showing how, by tracking these qualities, you can identify potentially smelly projects. The second concept is the architect as rule-maker. Although many open source projects are currently organized in a bottom-up fashion, I will argue that the best projects start with an architect who creates and enforces design rules and who monitors social smells.



Prof. Mark Harman

Facebook; University College London

Applications, Achievements and Avenues for Future Work in Crowdsourced Software Engineering

This talk will provide an overview of recent results in crowdsourcing for software engineering. It will cover applications across the spectrum of software engineering activity, as well as open problems and challenges for the software engineering community. The talk is based on the recently published survey paper “A survey of the use of crowdsourcing in software engineering” by Ke Mao, Licia Capra, Mark Harman, and Yue Jia, which is available from the Elsevier website: http://sciencedirect.com/science/article/pii/S0164121216301832.


Program


08:30-09:00 Welcome & Opening

09:00-10:00 Keynote 1: On Leading a Crowdsourced Project
Prof. Rick Kazman

10:00-10:30 Paper Session 1
The Good, the Bad and the Ugly: An Onboard Journey in Software Crowdsourcing Competitive Model
Leticia Machado, Alexandre Zanatta, Sabrina Marczak, Rafael Prikladnicki

10:30-11:00 Coffee Break

11:00-12:00 Paper Session 2
Crowd-Based Programming for Reactive Systems
David Harel, Idan Heimlich, Rami Marelly and Assaf Marron

Improving Model Inspection with Crowdsourcing
Dietmar Winkler, Marta Sabou, Sanja Petrovic, Gisele Carneiro, Marcos Kalinowski and Stefan Biffl

12:00-13:30 Lunch Break

13:30-14:30 Keynote 2: Applications, Achievements and Avenues for Future Work in Crowdsourced Software Engineering
Prof. Mark Harman

14:30-15:30 Paper Session 3
Preliminary Findings on Software Engineering Practices in Civic Hackathons
Kiev Gama

Prioritizing User Feedback from Twitter: A Survey Report
Emitza Guzman, Mohamed Ibrahim and Martin Glinz

Enriching Capstone Project-based Learning Experiences using a Crowdsourcing Recommender Engine
Juan Diaz-Mosquera, Andres Neyem, Pablo Sanabria, Denis Parra and Jaime Navon

15:30-16:00 Tea Break

16:00-17:00 Breakout Group Discussions



Accepted Papers


Juan D. Diaz-Mosquera, Pablo Sanabria, Andres Neyem, Denis Parra, Jaime Navon.
Enriching Capstone Project-based Learning Experiences using a Crowdsourcing Recommender Engine

Dietmar Winkler, Marta Sabou, Sanja Petrovic, Gisele Carneiro, Marcos Kalinowski, Stefan Biffl.
Improving Model Inspection with Crowdsourcing

David Harel, Idan Heimlich, Rami Marelly and Assaf Marron.
Crowd-Based Programming for Reactive Systems

Emitza Guzman, Mohamed Ibrahim, Martin Glinz.
Prioritizing User Feedback from Twitter: A Survey Report

Kiev Gama.
Preliminary Findings on Software Engineering Practices in Civic Hackathons

Leticia Machado, Alexandre Zanatta, Sabrina Marczak, Rafael Prikladnicki.
The Good, the Bad and the Ugly: An Onboard Journey in Software Crowdsourcing Competitive Model

Call for Papers


Workshop Theme

A number of trends under the broad banner of crowdsourcing are beginning to fundamentally disrupt the way in which software is engineered. Programmers increasingly rely on crowdsourced knowledge and code, as they look to Q&A sites for answers or use code from publicly posted snippets. Programmers play, compete, and learn with the crowd, engaging in programming competitions and puzzles with crowds of programmers. Online IDEs make possible radically new forms of collaboration, allowing developers to synchronously program with crowds of distributed programmers. Programmers' reputations are increasingly visible on Q&A sites and public code repositories, opening new possibilities in how developers find jobs and companies identify talent. Crowds of non-programmers increasingly participate in development, testing the usability of software or even constructing specifications while playing games. Crowdfunding democratizes choices about which software is built, broadening the range of software that might feasibly be constructed. Approaches for crowd development seek to decompose software development into microtasks, dramatically increasing participation in open source by enabling software projects to be built through casual, transient work. CSI-SE seeks to understand how crowdsourcing is shaping and disrupting software development, shedding light on the opportunities and challenges. We encourage submissions of studies, systems, and techniques relevant to the application of crowdsourcing (broadly construed) to software engineering.

Topics of Interest


Topics of interest include, but are not limited to:
  • Techniques for performing software engineering activities using microtasks
  • Techniques that integrate crowd knowledge into automated software engineering techniques
  • Techniques and systems that enable non-programmers to contribute to software projects
  • Open communities and systems for sharing knowledge such as Q&A sites
  • Techniques for publicly sharing and collaborating with snippets of code
  • Web-based development environments
  • Systems that collect and publish information on reputation
  • Empirical studies on the use of crowdsourcing in software engineering
  • Crowdfunding software development
  • Programming competitions and gamification of software development
  • Techniques for motivating contributions and ensuring quality in systems allowing open contribution

Submission

CSI-SE welcomes three types of paper submissions:

Full papers

Max. 7 pages. Full papers present in-depth studies, experience reports, or tools for crowdsourcing, including an evaluation; these submissions should describe new work relevant to crowdsourcing for software engineering.

Short papers

Max. 4 pages. Short papers describe early ideas with appropriate justification, preliminary tool support, or short studies that highlight interesting initial findings.

Position papers

Max. 2 pages. Position papers are short contributions that may present more speculative ideas than the other two types of contributions. Sound reasoning is important, but full justification or evaluation of the ideas is not necessary. This type of submission is meant to encourage novel and visionary contributions that have not yet been developed in depth.

Review

Each paper will be reviewed by three members of the program committee. Accepted papers will appear in the ICSE Companion Volume proceedings and be presented at the workshop.

Format and Submission Site

All papers must conform, at the time of submission, to the ICSE 2017 Formatting Guidelines. All submissions must be in PDF format and should be submitted electronically through EasyChair: https://easychair.org/conferences/?conf=csise2017.
Accepted papers will be published in the ICSE 2017 Workshop Proceedings in the ACM and IEEE Digital Libraries. The official publication date of the workshop proceedings is the date the proceedings are made available in the IEEE Digital Library. This date may be up to two weeks prior to the first day of ICSE 2017. The official publication date affects the deadline for any patent filings related to published work.

Important Dates

Submissions due: January 27, 2017 (Extended)
Notification to authors: February 17, 2017
Camera-ready copies of accepted papers: February 27, 2017

Workshop Organizers

Thomas LaToza

George Mason University, USA
CSI-SE'17 Co-Chair

Ke Mao

University College London, UK
CSI-SE'17 Co-Chair

Program Committee

Raian Ali (Bournemouth University, UK)
Alessandro Bozzon (Delft University of Technology, Netherlands)
Bora Caglayan (University of Limerick, Ireland)
Lydia Chilton (Stanford University, USA)
Schahram Dustdar (Vienna University of Technology, Austria)
Ethan Fast (Stanford University, USA)
Mark Harman (University College London, UK)
Fabrizio Pastore (University of Milano-Bicocca, Italy)
Rafael Prikladnicki (PUCRS University, Brazil)
Kathryn Stolee (North Carolina State University, USA)
Patrick Wagstrom (IBM TJ Watson Research Center, USA)
Ye Yang (Stevens Institute of Technology, USA)
Tao Yue (Simula Research Laboratory, Norway)

Steering Committee

Gordon Fraser (University of Sheffield, UK)
Klaas-Jan Stol (University of Limerick, Ireland)
Leonardo Mariani (University of Milano-Bicocca, Italy)

Workshop Venue (ICSE 2017 Co-located)