Sunday, June 18, 2017

When and Which Version to Adopt a Library: A Case Study on Apache Software Foundation Projects

By: Akinori Ihara, Daiki Fujibayashi, Hirohiko Suwa, Raula Gaikovina Kula, and Kenichi Matsumoto (Nara Institute of Science and Technology, Japan)
Associate editor: Stefano Zacchiroli (@zacchiro)

Are you currently using a third-party library in your system? How did you decide which version to adopt, and when? Did you adopt the latest version or an older (more reliable) one? Do you plan to update, and would you trust the latest version? These are all tough questions with no easy answers.

A software library is a collection of reusable programs, used by both industrial and open source client projects to help achieve shorter development cycles and higher quality software [1]. Most active library projects regularly release newer and improved versions to fix bugs, keep up with the latest trends, and showcase new enhancements. Ideally, any client user of a library would adopt the latest version immediately; it is therefore recommended that a client project upgrade its outdated versions as soon as a new release becomes available.

Developers do not always choose the latest version over previous versions

As any practitioner is probably well aware, adopting the latest version is not as trivial as it sounds, and may require additional time and effort (i.e., adapting code to the new API and testing it) to ensure successful integration into the existing client system. Developers of client projects are especially wary of library projects that follow a rapid-release style of development, since such projects are known to delay bug fixes [2]. In a preliminary analysis, we made two observations that may demotivate client users from updating:
  1. similar client users tend not to adopt a new version shortly after it is released, and
  2. there is a delay between a library's release and its adoption by similar clients.
These insights suggest that client users are likely to 'postpone' updating until a new release is deemed 'stable'. In this empirical study, we investigate how libraries are selected in relation to their release cycles.
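
To make the adaptation cost concrete, the sketch below shows the kind of change even a simple client faces when moving between major versions of an ASF library. Apache Commons Lang is used purely as an illustration (the class and method names are our own, hypothetical example); between Commons Lang 2.x (commons-lang:commons-lang) and 3.x (org.apache.commons.lang3:commons-lang3) the package was renamed, so every import and fully qualified reference has to be updated and retested before the upgrade can ship.

    // Client code originally written against Commons Lang 2.x used:
    //   import org.apache.commons.lang.StringUtils;
    // After adopting Commons Lang 3.x, the package name changes, so even an
    // otherwise unchanged call site must be edited, recompiled, and retested.
    import org.apache.commons.lang3.StringUtils;

    public class GreetingValidator {
        /** Rejects null, empty, or whitespace-only names. */
        public static boolean isValidName(String name) {
            return StringUtils.isNotBlank(name);
        }

        public static void main(String[] args) {
            System.out.println(isValidName("   ")); // false
            System.out.println(isValidName("Ada")); // true
        }
    }

Even this one-line change costs a rebuild and a test run; multiplied across a large code base, the reluctance to chase every new release becomes easier to understand.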

Design: We analyze when and which library versions are adopted by client users. From 4,815 libraries, our study focuses on the 23 most frequently used Apache Software Foundation (ASF) libraries, which are used by 415 client software projects [3].

Figure 1: Distribution of the periods between releases for each library

When to adopt a library?: We find that not all 23 libraries released a new version every year (see Figure 1). Some library projects (e.g., jetty-server, jackson-mapper-asl, mockito-all) often release new versions within a year (defined as quick-release libraries), while others (e.g., commons-cli, servlet-api, commons-logging) take over a year to come out with a release (defined as late-release libraries). We found that the more traditional and well-established projects (i.e., older than 10 years) were the late-release libraries, while newer projects tended to be the quick-release libraries.


Figure 2: Percentage of client users who selected the latest version (gray) or a previous version (black)

Which version to adopt?: Software projects do not always adopt new library versions in their projects (see Figure 2). Interestingly, we found that some client users of a late-release library would first select the latest version as soon as it was released, only to later downgrade to a previous version (in Figure 2, the red and blue boxes show the percentage of client users who downgraded after adopting the latest version or a previous version, respectively).

Lessons Learnt: From our study, we find that client users may postpone updates until a library is deemed stable and reliable. Although the quality of most open source software often improves through minor and micro releases, the study finds that client projects may wait, especially in the case of a late-release library. Our study validates the notion that updating a library is not trivial. We find that practitioners are indeed careful when it comes to adopting the latest version, as new releases may introduce dependency problems and potentially untested bugs.

We presented this study at the International Conference on Open Source Systems (OSS'17). For more details, please see the preprint and the presentation on our website: http://akinori-ihara.jpn.org/oss2017/

[1] Frank McCarey, Mel Ó Cinnéide, and Nicholas Kushmerick, "Knowledge reuse for software reuse," Web Intelligence and Agent Systems, Vol. 6, No. 1, pp. 59-81, 2008.
[2] Daniel Alencar da Costa, Surafel Lemma Abebe, Shane McIntosh, Uira Kulesza, and Ahmed E. Hassan, "An Empirical Study of Delays in the Integration of Addressed Issues," In Proc. of the 30th IEEE International Conference on Software Maintenance and Evolution (ICSME'14), pp. 281-290, 2014.
[3] Akinori Ihara, Daiki Fujibayashi, Hirohiko Suwa, Raula Gaikovina Kula, and Kenichi Matsumoto, "Understanding When to Adapt a Library: A Case Study on ASF Projects," In Proc. of the International Conference on Open Source Systems (OSS'17), pp. 128-138, 2017.

Monday, June 12, 2017

Supporting inclusiveness in diverse software engineering teams with brainstorming

By: Anna Filippova, Carnegie Mellon University, USA (@anna_fil)

Associate Editor: Bogdan Vasilescu, Carnegie Mellon University, USA (@b_vasilescu)


Diversity continues to be one of the most talked about issues in software engineering. It is a paradox – we understand that diversity is important not just for equity and increasing the pool of available candidates, but because it improves the quality of engineering. However, in practice, diverse teams struggle with the very thing that makes them so important – voicing differing or dissenting opinions. Because the benefits of diversity depend on everyone speaking up, it is important to create supportive group processes that ensure all team members can voice their opinions without fear of judgement or being ignored.


In this post, we describe one strategy that is likely already in an engineering manager’s toolkit – brainstorming.

The diversity paradox
It is well established that diverse teams are more creative and better at problem solving because they can leverage varied life experiences to make unexpected connections and avoid groupthink through constructive criticism. They are therefore particularly important in contexts where creative problem solving is required, such as solving engineering challenges. The advantages of diversity come not only from inherent traits (such as someone's gender or race), but also from acquired experiences (like education or living in different places), and it is important to support both in teams.

However, numerous studies have shown that diverse teams struggle to leverage their full potential: in unconstructive environments, team members who are in a minority struggle with feelings of intimidation or being ignored, while clashes in backgrounds between different factions in a team result in misunderstanding, suspicion, and conflict. In the short term, this impacts the effectiveness of diverse teams, while in the long term it can lead to greater intentions among minorities to leave the software engineering profession, especially in the early stages of their careers.

While we have made significant strides in improving representation at different levels of the pipeline, representation alone does not guarantee an effective team. It is important to think beyond supporting diversity through numbers alone, towards inclusive group processes through which minority individuals and challenging opinions are not only welcomed, but systematically integrated into the bigger picture.  

Brainstorming: an accessible strategy for diverse teams

Though we can take several different approaches towards more inclusive group processes, it is helpful to consider strategies managers may already be familiar with. Brainstorming is one such well-known technique designed to support innovation in teams, built on four core principles:
1)    Focusing on idea generation and discussion in a way that
2)    withholds judgement, and
3)    supports any ideas no matter how controversial, while
4)    encouraging the integration of all the ideas proposed rather than discarding them.

In other words, brainstorming supports exactly the kind of environment minority members of diverse teams need in order to feel comfortable voicing dissenting opinions without fear of judgement, criticism, or being ignored. Despite this promise, little empirical work to date had looked at the impact of brainstorming on teamwork in diverse groups.

In a recent study, we observed the effects of brainstorming on satisfaction in a short-term, time-intensive group work setting. Our study involved 144 participants across two non-competitive hackathons in the software engineering domain.

We found that brainstorming supported 1) better satisfaction with the process of working in the team and 2) a clearer vision of the team goals for all team members, regardless of their minority status, but the effect was significantly stronger for minority team members.

Without brainstorming, team members who described feeling like a minority in their group (we did not distinguish between inherent and acquired traits) felt less satisfied with the process of working in their groups, and were less clear about what their group aimed to produce, compared to their teammates. However, as Figures 1 and 2 illustrate, in teams that did utilize brainstorming, minority team members matched their teammates in terms of satisfaction and alignment with group goals.
Figure 1: The impact of brainstorming on satisfaction with working in the team, by participant minority status
Figure 2: The impact of brainstorming on goal clarity, by participant minority status

Key takeaways

Brainstorming is a readily available technique that managers are likely already familiar with, and, as our findings suggest, helps diverse teams work better together. In fact, because brainstorming supports satisfaction and a clearer vision of the team goals for all members of the team, there is reason to take a second look at the technique even if you are not yet managing a diverse team.

References:

Nigel Bassett-Jones (2005), The Paradox of Diversity Management, Creativity and Innovation. Creativity and Innovation Management, 14: 169–175.

Anna Filippova, Erik Trainer, James D. Herbsleb (2017) From diversity by numbers to diversity as process: supporting inclusiveness in software development teams with brainstorming. In Proceedings of the 39th International Conference on Software Engineering, ACM, New York.

Elizabeth Mannix, Margaret A. Neale (2005). What differences make a difference? The promise and reality of diverse teams in organizations. Psychological Science in the Public Interest, 6(2), 31-55.

Alex Osborn (1957) Applied imagination: Principles and procedures of creative problem-solving. C. Scribner’s Sons; Revised second edition.

Carroll Seron, Susan S. Silbey, Erin Cech, Brian Rubineau (2016) Persistence Is Cultural: Professional Socialization and the Reproduction of Sex Segregation. Work and Occupations, 43:2, pp. 178 – 214.

William A. Wulf (2002), The Importance of Diversity in Engineering. In Diversity in Engineering: Managing the Workforce of the Future. The National Academy of Engineering (eds.), Washington, DC: The National Academies Press.


Monday, June 5, 2017

IEEE March/April Issue, Blog, and SE Radio Summary

Associate Editor: Brittany Johnson (@drbrittjaydlf)

The March/April issue of IEEE Software again presents readers with a range of software topics that are currently being explored, researched, and improved. Some of the articles in this issue discuss topics such as software testing and requirements, but the focus of this issue was crowdsourcing in software engineering.

Guest Editors Klaas-Jan Stol, Thomas LaToza, and Christian Bird introduced the feature topic with the article "Crowdsourcing for Software Engineering". In it, the authors discuss what crowdsourcing is and introduce the articles in this issue that elaborate on the new opportunities and challenges that come with crowdsourcing in software engineering. Two of those articles are discussed below.


StackOverflow is becoming more popular as a data source for understanding how developers work, from what prevents developers from posting questions and answers [1, 2] to what makes a good answer on StackOverflow [3]. The authors of "What Do Developers Use the Crowd For? A Study Using StackOverflow" took a slightly different approach from previous studies and explored in what situations developers use the crowd available on StackOverflow. Rather than mining StackOverflow responses, the authors mined GitHub commit histories, looking for explicit mentions of StackOverflow in commit comments. They found, as expected, that developers most often use the crowd on StackOverflow to gain knowledge on topics such as development tools, APIs, and operating systems. Less expected uses they found include promoting StackOverflow to other developers and providing the rationale for feature updates or additions. This is just one of the many areas of research that showcases the advantages of crowdsourcing for software engineering.
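
The mining step can be approximated in a few lines of code. Below is a minimal sketch (our own illustration, not the authors' tooling) that lists the commits of a locally cloned repository whose messages mention stackoverflow.com; it assumes git is on the PATH and that the program is run from inside the repository's working tree.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    // Rough approximation of the mining step: ask git for every commit whose
    // message mentions stackoverflow.com and print its hash and subject line.
    public class StackOverflowMentions {
        public static void main(String[] args) throws Exception {
            Process git = new ProcessBuilder(
                    "git", "log", "-i", "--grep=stackoverflow.com", // match anywhere in the commit message
                    "--pretty=format:%H%x09%s")                     // print: hash <TAB> subject
                    .redirectErrorStream(true)
                    .start();
            int hits = 0;
            try (BufferedReader out = new BufferedReader(
                    new InputStreamReader(git.getInputStream()))) {
                String line;
                while ((line = out.readLine()) != null) {
                    System.out.println(line);
                    hits++;
                }
            }
            git.waitFor();
            System.out.println(hits + " commits reference StackOverflow");
        }
    }

A real study would of course go further, for example resolving each mention to the linked question and categorizing it, but even this simple filter gives a feel for how commit histories can serve as a window into crowd use.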

But of course, as with any new research direction or opportunity, there needs to be research that explores the challenges that come with it. The article "Barriers Faced by Newcomers to Software-Crowdsourcing Projects" sheds some light on the barriers encountered by newcomers to software projects that use crowdsourcing to implement parts of the project. The authors conducted quantitative and qualitative analyses and found that some of the more frequent barriers newcomers encounter include a lack of documentation, poor task management, and problems understanding the code structure or architecture. The authors also propose guidelines for how we might help developers overcome these barriers. One proposed guideline suggests that task management can be improved by better matching tasks to developers with the necessary expertise; the authors suggest using task complexity to determine which tasks are appropriate for newcomers. Another alternative, suggested by existing research, would be to determine newcomer expertise and find tasks that match it [4]. The key is to help others contribute more efficiently to software-crowdsourcing projects.

This was a particularly special issue, as the editors included a thank-you to the numerous people who reviewed articles for the magazine in the past year. Another thanks to those who have taken the time to help make IEEE Software great!

IEEE Software Blog

There are some interesting posts featured for March and April, mostly focused on frameworks and tools for improving the software development process. Peter Chen of Queen's University discussed a framework called CacheOptimizer that provides instrumentation for finding optimal cache configurations when using application-level caching frameworks. Yossi Gil and Matteo Orrù presented a tool called The Spartanizer. Their tool helps developers code in the "Spartan Programming Style": in this style of coding, programmers phrase their code statements so that they say the most in the fewest words. Finally, Mark Basham of Diamond Light Source discussed ways we can scale up the software development process, recommending revisiting classic CS books such as "The Mythical Man Month" and "Peopleware".
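
To give a flavour of the Spartan style mentioned above, here is a small, hand-written illustration of the kind of condensation it favours; this is our own example, not output from The Spartanizer, which automates refactorings in this spirit for Java code.

    public class SpartanExample {
        // Verbose version: a temporary variable and explicit branches
        // where a single expression already says everything.
        static boolean isAdultVerbose(int age) {
            boolean result;
            if (age >= 18) {
                result = true;
            } else {
                result = false;
            }
            return result;
        }

        // "Spartan" version: the same behaviour in one short statement.
        static boolean isAdult(int age) {
            return age >= 18;
        }

        public static void main(String[] args) {
            // Both versions agree on every input.
            System.out.println(isAdultVerbose(17) == isAdult(17)); // true
            System.out.println(isAdultVerbose(21) == isAdult(21)); // true
        }
    }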

SE Radio

In this issue, SE Radio welcomed a new host, Kim Carter. For his first SE Radio episode, Kim spoke with Francois Raynaud on DevSecOps and improvements that should be made to existing delivery approaches. In March and April, SE Radio also featured episodes pertaining to system quality, including episodes on testing, system failures, and distributed storage.
There are also some episodes in this issue for learning new skills or technologies, including an intro to machine learning with Katie Malone and success skills for architects with Neil Ford.



[1] Ford, D., Smith, J., Guo, P. J., & Parnin, C. (2016, November). Paradise unplugged: Identifying barriers for female participation on stack overflow. In Proceedings of the 2016 24th ACM SIGSOFT International Symposium on Foundations of Software Engineering (pp. 846-857). ACM.
[2] Asaduzzaman, M., Mashiyat, A. S., Roy, C. K., & Schneider, K. A. (2013, May). Answering questions about unanswered questions of stack overflow. In Mining Software Repositories (MSR), 2013 10th IEEE Working Conference on (pp. 97-100). IEEE.
[3] Nasehi, S. M., Sillito, J., Maurer, F., & Burns, C. (2012, September). What makes a good code example?: A study of programming Q&A in StackOverflow. In Software Maintenance (ICSM), 2012 28th IEEE International Conference on (pp. 25-34). IEEE.
[4] Johnson, B., Pandita, R., Murphy-Hill, E., & Heckman, S. (2015, August). Bespoke tools: adapted to the concepts developers know. In Proceedings of the 2015 10th Joint Meeting on Foundations of Software Engineering (pp. 878-881). ACM.