

RESEARCH STATEMENT
SHAHAN A. MEMON

I am drawn to studying people’s behavior and interactions in order to understand, uncover, and solve complex and
inconspicuous societal problems using large-scale datasets. These interests emerged from seven years of research
experience in fields including computational social science, voice forensics, the science of science, machine
learning, and network science. My decision to pursue a PhD is driven by my commitment to research as a
career. The freedom that an academic environment provides to realize one’s research goals is unparalleled, and I
believe it is essential for pursuing my own goals and creating real impact in society. I also see research
as a collaborative enterprise. In my formative years, I have benefited greatly from the collective effort of the research
community, and I am eager to contribute to this collective endeavor and to take part in mentoring students in the
future at an institution like UW.

My current research interests fall within the broad fields of computational social science and information
science, with a focus on the science of science and online communities. My research goal is to understand the underlying
social and information dynamics of science and society, and to develop computational solutions to improve them.

MAKING SCIENCE BETTER


I have worked with four different research groups on about 15 different research projects. Every group I have worked
with has had a different culture and a different way of doing science. Nobody really follows the “scientific method”
we learn in school. Some of our studies are observational, others are experimental. Some are descriptive or
correlational, others are causal. Some use surveys, others use large-scale datasets found in the wild. As Paul Feyerabend
said, “The only principle that does not inhibit progress (in science) is: anything goes”. But if “anything goes”, how and
when should one trust science, or scientists for that matter? Historian of science Naomi Oreskes argues [1] that
science is a consensual process, and that one should trust science when it has undergone “organized skepticism” [2],
or what we now call peer review, by a diverse set of scientists. Science, however, is not only a consensual process but also a
social one. So what about the social implications of science? Dr. Oreskes writes, “Scientists attempt to escape the
sting of these extra-scientific considerations by retreating into value-neutrality, insisting that while our science may
have [..] implications, the science itself is value-free.” The producers, evaluators, and consumers of science, however,
are all humans with values and biases. As such, science cannot really be value-neutral. But if that is the case, do we
really understand the social implications of science, its dissemination, and the institutionalized scientific processes?
Do we understand the societal biases that come into play when science is produced, evaluated, and disseminated?
These were the questions I was interested in studying when I joined New York University Abu Dhabi (NYUAD) as a
research associate.

Soon after I joined NYUAD, a paper published in Nature Communications studying mentorship in academic
collaborations [3] triggered a social media backlash. The debate largely centered on disagreement over the
article’s conclusions and eventually led to its retraction by the authors. Being an active consumer of Twitter, I
was aware of the events and discussions that took place online, and of the negative “attention” the paper
received from the public, despite no methodological errors having been pointed out. This presented a conundrum
regarding the social implications and value neutrality of science, and it sparked an idea about the “cancellation” effect
on the careers of retracted scientists. While we have institutionalized retraction as a corrective scientific process, do
we really understand its effects, and are those effects desired by the community? What role does online
“attention” play in “cancellation”? I joined forces with Professor AlShebli and Professor Makovi to answer these questions
by studying the effect of retractions and online attention on the careers and collaboration networks of scientists. To
operationalize this, we merged three large-scale datasets: Microsoft Academic Graph (>263 million publications),
Retraction Watch (>26K records), and Altmetric (which tracks over 35 million research outputs). Using a matching
experiment, we found that authors who have not experienced a retraction tend to gain significantly more new
collaborators and retain significantly more of their past collaborators than their retracted counterparts, with no
heterogeneity in the effects across groups (gender, discipline, etc.). After analyzing the career histories of the scientists,
however, we found that about 35% of retracted authors stop publishing in the year of their retraction. As such, once one accounts for this attrition,
all of the effects reverse, suggesting a possible explanation that “what does not kill you makes you stronger.” We also found
that high levels of online “attention” are positively associated with the end of an academic publishing career. I
presented this work at IC2S2 this year, and we are currently preparing the manuscript for a general
scientific audience.

Science, and many of our institutionalized scientific processes like retraction, are far from perfect. Changes in these
processes are often sudden and undiscriminating; a case in point is the recent eLife decision to no longer reject papers once they are under review [4].
Therefore, understanding the prominence of human biases and the validity of the fundamental assumptions that
underlie these processes is instrumental to upholding objectivity and/or desired standards in scientific research. This
is a line of inquiry that I would like to push forward as I pursue my PhD.

At UW, I am interested in Professor West’s ongoing work in the science of science. Their recent work [5] studying and
questioning the assumptions behind replication surveys is inspirational. I also find their work on gender
disparities in science [6,7] well aligned with my research goals. My own research projects with Professor AlShebli in
the science of science and network science center on a variety of questions linked to inequality, relying on several
additional large-scale datasets. These projects include studying (a) US-China collaboration in AI (also with Professor
Evans at UChicago), (b) the role of gender, ethnicity, prestige, and mentorship in the careers of postdoctoral
associates (also with Professor Rahwan and Professor Holme at Tokyo Institute of Technology), and (c) gender and
ethnic gaps in pay and career mobility (also with Professor Adnan and Professor Park). In my research, I am always drawn
to studying the role of gender, race, and other dimensions of identity, as these factors often have great societal
implications. I believe quantifying inequality is important for changing minds and for implementing reforms to
governmental and organizational policies. As such, studying and quantifying inequality in science is another line of
research I am interested in.

UNDERSTANDING AND REDUCING MISINFORMATION


Online social networks have become a hotbed for the spread of false information. It is more important than ever to
identify ways to mitigate and debunk mis/disinformation on online platforms. It may not be possible to completely
eradicate it, but we can certainly minimize it. This requires both an understanding of the different aspects of
misinformation, such as its provenance, spread, consequences, and the engagement it receives, as well as corrective
experiments and frameworks for intervention. While I was a Master’s student at CMU’s Language Technologies Institute, I
worked on a related project, as part of my Master’s thesis, to characterize misinformed online health communities. I
used sociolinguistic and network analysis tools to understand how competing misinformation communities (e.g., pro-vaccine
vs. anti-vaccine) differed in terms of their network and linguistic structures. I tapped into their communication
networks on Twitter and found stark differences in the way these communities interacted. Furthermore, studying
different kinds of misinformation (COVID vs. vaccination), I found that the characteristics of misinformed and
informed communities did not necessarily generalize, owing to inherent differences in the demographics of the
users who engage. This research was supported by a fellowship awarded to only eight students at CMU in 2019 by the
Center for Machine Learning and Health (CMLH), and it culminated in two well-cited publications at reputable
conferences [8,9] in collaboration with Professor Carley. As part of the project, we also developed a COVID-19
misinformation-related Twitter dataset called CMU-MisCov19 [10], which currently has 1,716 downloads on Zenodo.

Building on my prior work on misinformation, I am interested in studying the role of online interventions such as
public health messaging and social corrections. Corrections are an important mechanism for dispelling myths and
misinformation [11]. However, social corrections typically yield lower engagement, and understanding and designing
ways to promote engagement is extremely important. One of the main lessons I learned from my Master’s thesis was
that, due to extreme polarization, there needs to be a mechanism for informed and misinformed communities
to engage in two-way, constructive communication. Unfortunately, the way people communicate online often lacks empathy,
listening, and understanding, preventing them from having a constructive dialogue. It is important to find ways to
encourage and promote civil and respectful engagement. Professor Saveski’s work on building methods and tools for
making online conversations more productive addresses these core issues. I find their work on the structure of toxic
conversations [12] and the role of perspective-taking in promoting empathy [13] to be inspirational and essential for
pushing this area of inquiry forward.

With regard to interventions, we also have a limited understanding of the (long-term) effects of interventions such as
corrections. The ongoing research of Professor West, Professor Spiro, and Professor Starbird on the
effectiveness of different kinds of interventions [14] is a step forward. Interventions and policy decisions also cannot
be generalized to all populations, and they need to be locally contextualized within the target community. This requires
participatory design methods that involve the primary stakeholders not just in understanding these solutions, but in
co-creating them.

I am also interested in the nexus between science and misinformation, and how one affects the other. These interests
are interlinked with my desire to improve scientific communication, which is a novel and fast-developing area of
research. The scientific publishing process that we adhere to does not cater well to the general public, making
science largely inaccessible. Some journalists lack training in science and often rely on their own interpretations of
scientific papers. Identifying ways to redesign scientific communication is imperative for a healthy dialogue, especially
around contested and often politicized issues such as climate change.

Information science is an interdisciplinary field, and given my diverse research background, I am well suited for a
career in it. I am applying to the UW iSchool because of its interdisciplinarity and its strong community of faculty,
students, and alumni. There is one other important reason: collaborations across departments (e.g., HCDE,
CS) are commonplace at the UW iSchool. Collaborations bring diversity of thought, prevent blind spots, and
create more impactful research. Collaborations are not just important epistemologically; they are also key to a
successful career in academia. As my ultimate goal is to build a career in the academy, the UW iSchool’s PhD program is a
natural next step.

REFERENCES
1. Oreskes, N. (2021). Why Trust Science? Princeton University Press.
2. Merton, Robert K. "The normative structure of science." The sociology of science: Theoretical and empirical investigations
(1979): 267-278.
3. AlShebli, B., Makovi, K., & Rahwan, T. (2020). RETRACTED ARTICLE: The association between early career informal
mentorship in academic collaborations and junior author performance. Nature communications, 11(1), 1-8.
4. Else, Holly. "eLife won't reject papers once they are under review-what researchers think." Nature (2022).
5. Bak-Coleman, Joseph, et al. "Replication does not measure scientific productivity." (2022).
6. West, Jevin D., et al. "The role of gender in scholarly authorship." PloS one 8.7 (2013): e66212.
7. King, Molly M., et al. "Men set their own cites high: Gender and self-citation across fields and over time." Socius 3 (2017):
2378023117738903.
8. Memon, S. A., Tyagi, A., Mortensen, D. R., & Carley, K. M. (2020, October). Characterizing sociolinguistic variation in the
competing vaccination communities. In International Conference on Social Computing, Behavioral-Cultural Modeling
and Prediction and Behavior Representation in Modeling and Simulation (pp. 118-129). Springer, Cham.
9. Memon, S. A., & Carley, K. M. (2020). Characterizing COVID-19 misinformation communities using a novel Twitter dataset.
6th International Workshop on Mining Actionable Insights from Social Networks, CIKM.
10. Memon, S. A., & Carley, K. M. (2020). CMU-MisCov19: A Novel Twitter Dataset for Characterizing COVID-19
Misinformation [Data set]. 5th International Workshop on Mining Actionable Insights from Social Networks (MAISoN) at
CIKM 2020, Online. Zenodo. https://doi.org/10.5281/zenodo.4024154
11. Bode, Leticia, and Emily K. Vraga. "See something, say something: Correction of global health misinformation on social
media." Health communication 33.9 (2018): 1131-1140.
12. Saveski, Martin, Brandon Roy, and Deb Roy. "The structure of toxic conversations on Twitter." Proceedings of the Web
Conference 2021. 2021.
13. Saveski, Martin, et al. "Perspective-taking to reduce affective polarization on social media." Proceedings of the
International AAAI Conference on Web and Social Media. Vol. 16. 2022.
14. Bak-Coleman, Joseph B., et al. "Combining interventions to reduce the spread of viral misinformation." Nature Human
Behaviour (2022). https://doi.org/10.1038/s41562-022-01388-6
