New members are being elected to the VIS Steering Committee (VSC) and the VIS Executive Committee (VEC), which provide scientific and organizational oversight of the IEEE VIS conference and its reviewing process.
Following the example set by the Society for the Improvement of Psychological Science, members of the Transparent Statistics in HCI group asked the 2021 VSC and VEC candidates to publicly answer a question about research transparency and open practices:
If elected to the IEEE VIS [Committee], what (if any) policies would you promote to improve research in visualization, and how would you support open science practices and research transparency at IEEE VIS and in the field of visualization more broadly?
Because norms and practices differ considerably across areas of visualization, and because the most salient issues vary by sub-discipline, the question is deliberately broad and open-ended. The email was signed by Steve Haroz, Fanny Chevalier, Lewis Chuang, Pierre Dragicevic, Shion Guha, and Matthew Kay.
Statements will be posted as they are received.
VIS Steering Committee
Chris Johnson
Statement not yet received (request sent Sept. 13)
Kwan-Liu Ma
Even though VIS has created a set of well-thought-out guidelines for both authors and reviewers, we should not stop improving the paper review and selection process, since this process is so crucial to the advancement of the field. I have found that the overemphasis on thorough evaluation and user studies for TVCG-track papers often leads to accepting papers that make only incremental contributions while rejecting papers that present novel ideas. I like how other scientific disciplines recognize conference presentations focused on original ideas and consider archival publication subsequently. I would like to further promote short papers or bring back a highly recognized conference-track program. When asked to serve on this year’s VIS short papers committee, I was glad to find an emphasis on new and novel contributions rather than treating short papers as second class. So it’s a good start, but not enough. I will also propose re-examining the current VIS paper reviewing process to further improve the quality of reviews and selections.
I am fully behind open science practices and research transparency. However, we need to be careful and reasonable in setting expectations and in educating both researchers and reviewers about open science and reproducibility. All of this requires a community effort, and VIS/VGTC should lead and sponsor it. Therefore, I would support organizing and offering regular VIS conference workshops/tutorials, as well as developing online instructional materials on research transparency. Compared to 10 years ago, we have a stronger basis and more resources for teaching visualization, but what counts as a comprehensive education for visualization researchers? As a field, we should collectively develop a curriculum that includes transparent research in VIS.
Bernhard Preim
I consider open science, e.g., open-source software and open data, essential primarily to foster reproducibility. The fact that many scientific results cannot be reproduced is a severe problem for science in general. Thus, I consider the Graphics Replicability Stamp Initiative an essential step in this direction and would support further initiatives to strengthen transparency. However, I am also aware of differences among the subcommunities. While open-source software and public databases play an essential role in BioVis and bioinformatics, in medical applications it is still rare that data can be made publicly available. Without example datasets, even open-source software is of limited value. Thus, I would not favor making open-source software mandatory for publications at VIS.
Anna Vilanova
As mentioned in http://ieeevis.org/year/2021/info/vsc-candidates, one of my priorities would be keeping high-quality standards in our conferences and reviewing process while being supportive, open, broad, and welcoming to new research developments. It is important to keep supporting developments in existing fields while embracing new ones. Furthermore, a good balance should be found between fundamental and applied research, both of which are very necessary. Diversity should be promoted in many aspects: research, gender, age, origin… The IEEE VIS community should be proud of its talented young researchers and do what it can to support their careers. I would like to promote policies that give them visibility and let them shine (e.g., through awards or recognitions).
I am in favor of making publications and research results easily available to our community and to society in general, which is, after all, our main investor. I consider transparency and reproducibility very important for advancing a research field. In visualization, reproducibility is often difficult to achieve, and I think more could be done to improve it. Policies that support and promote open code and data sets could be adopted; it could also be required that software always be provided to reproduce the presented results, especially for some types of papers. Exposure could also be increased for authors and publications that make their code and data available. However, we have to keep it realistic: many of us work with sensitive (medical, security, …) data and in collaboration with companies, which makes openness in publication, especially of data, difficult if not impossible. So the policies should promote and motivate openness, but should not hinder publications from specific fields or research that comes from industry.
VIS Executive Committee
Rita Borgo
Open science, open access, and open data are all fundamental to the progress of scientific research and knowledge, and milestones we should really hope to achieve. We see striking examples of what happens when this is not the case and patents are placed ahead of people’s wellbeing.
So, am I in favour of open science? Yes, I am, as I believe all of us are.
Have I been following policies development? To the extent of my abilities, I have.
I am familiar with the Open Data Institute (ODI) initiatives, the Organisation for Economic Co-operation and Development (OECD) work on Open Government Data (OGD) policies and the Center for Open Science (COS). I have also followed initiatives emerging in our own communities like Harvard Dataverse, and contributed to my own local community with work on the London Data Store.
Nevertheless, I have also found myself on both ends of this challenge: chasing access to data in support of my students’ work, and seeing my students’ research under threat of not being published because we were not in a position to grant access to the data.
This I do not wish upon anyone.
It is clear to me that the discourse is deeply complex: there are domain constraints, capability constraints, resource constraints, privacy constraints, and most of all national constraints, with policies differing considerably from one country to another (as an easy example, the EU GDPR is now a distinct entity from the UK GDPR).
As a community we have a wealth of research diversity, which I wish to see thrive. This will mean looking into the opportunities that would support its development, identifying the needs, and, if we are to develop policies, determining which ones can bring a concrete benefit, not just to the community as an abstract entity but to the individual researcher.
I strongly believe in incentives, not constraints, and in initiatives that sustain a researcher’s work, and I would welcome tools and ideas that could support anyone who wishes to work towards open science but does not have the resources to do so.
This is also embedded in something very important to me: respecting my colleagues’ freedom of choice.
We are already under enough pressure; let us use our energy to support each other’s work and successes.
Stefan Bruckner
As I also wrote in my candidate statement (see http://ieeevis.org/year/2021/info/vec-candidates#stefan-bruckner), if elected, I plan to actively support and promote initiatives dedicated to furthering open science practices and transparency.
There are several measures that I think would be worth investigating. In particular, I believe it is important to create stronger incentive structures for making source code, data, and other artefacts available to the community. Initiatives such as the Graphics Replicability Stamp Initiative (GRSI) are also gaining some traction in the visualization community, but at the level of VIS there is, at present, no explicit tie-in with such efforts (unlike for direct TVCG submissions, for instance). Furthermore, I think a dedicated award for papers that go significantly beyond what is required in this respect would be a good starting point to signal that such efforts are recognized and worthwhile. This also goes hand in hand with better integrating questions of transparency into the review process.
However, I believe it is also important to recognize that the partial reluctance towards implementing such policies is not due to malicious intent. For instance, as someone who has worked with collaborators in industry for many years, I know that it can be very difficult to get stakeholders on board with what some may see as potentially detrimental to their interests. We need to be conscious of such issues and make sure that all opinions on the topic are openly discussed and considered. As the measures proposed above suggest, I believe a progressive effort towards encouraging, incentivising, and providing visibility to work that follows best practices is the way to go here (as opposed to strict enforcement). Our community has shown, for instance in the excellent work of the reVISe committee, that it is capable of driving dedicated efforts to renew itself substantially, and I am confident that the changes in the governance of VIS will be a great contributor to making similar advances in open practices.
Michael Correll
As a researcher in communicating uncertainty and statistical concepts to non-statistical audiences, as a participant in organizations dedicated to statistical openness (such as the Transparent Statistics in HCI group), and as a frequent commenter on the epistemological and communicative conundrums at the core of visualization research, establishing thoughtful and rigorous norms around how we conduct and report on our research is a core concern of mine. Hopefully the centrality of this concern is reflected in both my work and my VEC candidate statement. I will use my response here to highlight some potential areas where I think we could (or should) change as a conference.
I will preface any statements about policy by saying that one of the things that draws me to the visualization community is that we come from many backgrounds and have a wide and varied view of what it means to do “good” visualization research. Any scheme to “improve” research in visualization must be undertaken from this assumption of plurality: standards that work for an author presenting a graphical perception study would look quite different from standards for an author presenting a novel rendering algorithm, or one presenting a close reading of a visualization on historical or aesthetic grounds.
With that said, I believe that we can look to other fields and conferences for examples of policies that encourage openness, rigor, and reliability. The disruptions to the usual way of presenting work caused by the pandemic and by new open-access mandates such as Plan S are long-overdue crises that will require us to rethink the publication model in any event, so why not build one that is more conducive to better practices from the outset? “Encourage” is the key word for me here; I think we will have more success providing positive reinforcement for high-quality research than introducing new, stricter, and potentially unfamiliar rules or punishments. Other conferences are taking big bets or performing big experiments around how work is submitted (rolling deadlines, revise-and-resubmit options, and rebuttals) as well as around the contents and structure of that submitted work (requiring statements of broader impacts, positionality, or data accessibility). As we learn more about the results of those experiments, we should borrow, adapt, or outright copy the things that work.
One particular form of encouragement I would like to see is a more expansive view of conference contributions beyond “10 pages of static content due on a specific date and published in isolation in a special edition of a journal.” Preregistrations, registered reports, and replications are artifacts that can shore up the rigor of empirical work. Yet these artifacts are currently rare or, at best, coexist uneasily with the standard IEEE VIS reviewing and publication model. We need to make room for this sort of work at VIS. Providing official and explicit guidance, recognition, and promotion for these “non-standard” research artifacts would go a long way towards moving these practices from the periphery to the core of empirical work at VIS.
Helwig Hauser
Statement not yet received (request sent Sept. 17)
Filip Sadlo
Statement not yet received (request sent Sept. 17)