Research talk:Harassment survey 2015
This talk page is for discussion of the Harassment survey 2015, not discussion of the consultation on the same topic. If you have comments or questions about the consultation, please share them at talk:Harassment consultation 2015.
Move to Research:Index?
Can we move this page to Research:Index? This is a research project and should probably live there. The purpose is to keep all research organized in one space. I realize this might add a bit more work, but this is important for long-term documentation of the work that we do as a movement. Thanks! --EGalvez (WMF) (talk) 20:36, 2 November 2015 (UTC)
- Also, I'd be happy to contribute to the page if/once it's moved. Thanks --EGalvez (WMF) (talk) 20:37, 2 November 2015 (UTC)
- There seem to be far more surveys in mainspace, but I have no objections, if there's a move to reorganize surveys to the Research namespace. If you want to, go ahead, Edward. However, the consultation should not move. Please leave a redirect if you do move the page, because otherwise people coming to it from the email especially will be lost. :) --Maggie Dennis (WMF) (talk) 20:42, 2 November 2015 (UTC)
- Thanks, Maggie! - Will move and leave redirects --EGalvez (WMF) (talk) 20:46, 2 November 2015 (UTC)
- I remember suggesting this to Patrick, but I might have forgotten to CC everyone else. Btw, the research index is still searchable by default. --HaithamS (WMF) (talk) 01:07, 3 November 2015 (UTC)
- The move seems to be uncontroversial, as long as you leave redirects. :) I myself am comfortable with where it is, but if you're not and you're familiar enough with the Research space protocols to be sure it's welcome, have at it! --Maggie Dennis (WMF) (talk) 15:24, 3 November 2015 (UTC)
Design fault
There was a harassment survey on a pagetop banner today. I started to answer it. It might be a good idea, before doing any more such surveys, to do a survey on why people don't complete surveys. There's a basic design fault.
Here's a hint on how to do it better: I hope it's useful! If answers are wanted, allow people to submit their answers when they get tired of going round the "You must answer this question" loop; because if the only way to end the loop is to close the window, that's what they'll eventually do. If answers are not wanted, don't do the survey ...
The system didn't give me any clues, but I believe that the questions I couldn't answer were "Other: please specify", twice. Anyone who can tell me what the answer to that question would have been is quite a philosopher.
It wasn't easy to find this page -- that's another design fault, I'd say -- so I have meanwhile posted this comment at en:Wikipedia:Harassment. Andrew Dalby (talk) 09:55, 3 November 2015 (UTC)
- Hello Andrew, thank you for raising your concerns. There are currently a few questions in the survey where you are required to provide an answer for every statement, including the statement 'Other'. In those questions, you can select 'never' or 'not applicable' as a response for 'Other'. This will allow you to move to the next question. Participants are not able to go back to questions because the question order helps them keep context in mind when responding. We will explore how we can improve the experience though, based on your comments. So, thank you for sharing them. Kalliope (WMF) (talk) 14:14, 3 November 2015 (UTC)
- The page with sliders on which forms of harassment I have experienced would not advance until they were all non-zero. I selected a distinct number and added a note to say that actually meant zero. Burninthruthesky (talk) 16:39, 3 November 2015 (UTC)
- I'd like to say that I also found having to answer a question marked 'Other' completely unintuitive; it did not occur to me to select 'not applicable' without writing anything into the box. Bilorv (talk) 18:00, 3 November 2015 (UTC)
- I concur that requiring "other" be answered and requiring sliders to be nudged before they will register zero are both poorly designed. Figuring out how to make them work was a waste of time for me. For some this must have been a non-starter that led them to give up. ~ Ningauble (talk) 18:44, 3 November 2015 (UTC)
- I tried to do the survey today, and gave up after the first question, which asked how often I contribute to a "Wikimedia project". I didn't know what a "Wikimedia project" was, so I tried to find out, failed, and gave up. I have since learned that en:Wikipedia counts as a "Wikimedia project". So the very first question is enough to weed out all but the most experienced users. (A subsequent attempt to find the survey took me to the mystifying [1].) Maproom (talk) 18:05, 3 November 2015 (UTC)
- Thank you for letting us know, Maproom. The first question is now linked to the list of Wikimedia projects. Any users unsure what those are can now quickly refer to it. I hope this will be helpful. Kalliope (WMF) (talk) 10:27, 4 November 2015 (UTC)
The tech issue with the multi-statement questions (like the one with the sliders) should now be fixed and should no longer force participants to enter a number in order to progress. You will still get a reminder that "you have not answered all questions/statements" if some of the statements are unanswered, but it should no longer prevent you from moving on to the next question. Do keep in mind that it is still not possible to move backwards in the survey. So, if you choose to leave statements unanswered, you won't be able to return to them later. Thank you all for raising your concerns, as this did not appear to be an issue during the testing phase, before the survey's launch. If you become aware of other such glitches, do let us know. Kalliope (WMF) (talk) 09:15, 4 November 2015 (UTC)
- Thanks very much for your responses, Kalliope. I believe the designers and testers of surveys of this kind don't realise how many potentially good responses they lose halfway. It's for a similar reason that I have never managed to complete a survey on The Guardian website: I'd love to respond to them, it's a great newspaper, but eventually I don't have the patience to work out what it is that they think I am failing to do. Andrew Dalby (talk) 14:02, 4 November 2015 (UTC)
Meaning of "disclosing" identity
There was a question near the end of the survey asking how often the surveyed person "discloses" various aspects of his/her identity. Now, I use my real name as my account name, and provide a link from my user page (on at least one wiki, not meta) to a social-media page where I disclose religion, gender, location, etc.; but I don't have any identity or POV badges on my user page, and so far I've never qualified a commit comment or discussion participation with "I have a COI with subject X because my identity is Z", or anything like that. Furthermore, my name provides clues to my gender and ethnicity. Does that count as "always" or "never"? If someone used his or her real name but no outside links, and a Google search on that name would turn up identifying information, how should that person answer the questions?
So that part of the survey might be better with an "N/A" option, or gradations of directness within "Always". DavidLeeLambert (talk) 16:25, 3 November 2015 (UTC)
Agreed. It's not like I've gone out of my way to explicitly make any of those things clear, but I'd imagine some of them are obvious given the way I go about my business...I, too, use my real name.--MichaelProcton (talk) 16:54, 3 November 2015 (UTC)
- There are direct and indirect ways of disclosing private information about you. For example, if you are using your real name as your WP username, you don't need to link it to your social media profiles. No matter how easy or difficult it is for somebody to trace you through google searches, it remains a fact that your real name is out there in an obvious and hard-to-miss way (others will always know your name by looking at any of your activity in the projects). So, this question would warrant 'always' as a response. A more indirect way to disclose information, such as gender, is to use words that change form depending on the gender they refer to, or are gender-specific. For example, the username 'moonriddengirl' indicates that the person using it is female through the word 'girl'. So, this would also warrant an 'always' response. If nothing in your username or user page indicates where you are from, but you have at times shared this information through discussion pages, then this would be a 'sometimes'. Once it's out, it's technically out. But one should also consider the degree of difficulty (or ease) in finding that piece of information and spreading it or using it against you. Kalliope (WMF) (talk) 09:31, 4 November 2015 (UTC)
Content-free page
I saw this mentioned at enwiki and looked here to get some information, but there is just market-speak. What survey? Is it by invitation, or is there a link I have missed seeing? Johnuniq (talk) 01:31, 4 November 2015 (UTC)
- What is "market-speak"? --FeralOink (talk) 15:52, 7 November 2015 (UTC)
- Hello, Johnuniq. :) It's a link via banner which displays randomly to a percentage of a project's users and which is being sent to a random selection of dormant contributors. It is starting on smaller wikis first and working its way towards larger projects, culminating on English Wikipedia. We have a limited number of survey responses we can accept, and English is likely to overwhelm that number fairly quickly. It will run until we run out (or for two weeks, in the very unlikely circumstances that we don't). --Maggie Dennis (WMF) (talk) 01:38, 4 November 2015 (UTC)
- Thanks but my point is that it is very irritating that there is no indication of that on the page; that leads to people like me wasting time by carefully examining every link to see what I'm missing. There should be a community communications team who can check that information pages actually provide information. Johnuniq (talk) 02:28, 4 November 2015 (UTC)
- You raise a good point, Johnuniq. I created the page, and I'm sorry if it disappoints. But it's a wiki page and changeable, so I'll add those details now. --Maggie Dennis (WMF) (talk) 02:33, 4 November 2015 (UTC)
Reporting harassment
Responding to the survey, I think I have some stories the CA team should be aware of, concerning a long-term harassing user. But I failed to find the relevant information from CA. Could you enlighten me? Is it cawikimedia.org? Not being sure, posting here... — regards, Revi 12:57, 4 November 2015 (UTC)
- Hi Revi. Particular harassment or abuse issues should not be posted here, as this discussion page is neither a reporting mechanism nor a resolution process. The CA team can indeed be contacted directly through the email address that you mention, also found at the very bottom of the CA page. Kalliope (WMF) (talk) 14:54, 4 November 2015 (UTC)
- I know ;) I was just double-checking since I was in doubt. Thanks for confirming. — regards, Revi 16:08, 4 November 2015 (UTC)
User page vandalism
I was quite surprised to see user page vandalism listed as a form of harassment. It is usually a form of vandalism and very, very rarely a way to harass people. Just look at Jimbo's page history to see that his page was vandalised at least once a week, but hardly anyone wanted to harass him. And the real harassment (where a person vandalises a user page to harass someone) is double counted, as it always falls under using obscene words / discrimination based on some identity / threats / trolling, or at least is a form of stalking. I would suggest not counting this line, as it is irrelevant (or at most is a proxy for a user's counter-vandalism activity) — NickK (talk) 17:56, 5 November 2015 (UTC)
- NickK, thanks for this feedback, and for taking the survey. This may be a good point to raise during the upcoming consultation as well. With this survey, we're trying to track both the forms that harassment is taking, and the method/medium/mechanism used to deliver it. I agree that most user page vandalism takes the form of blanking, template-breaking, or silly/nonsense text, but some of it is directly intended to insult or threaten the page owner. So we did want to track this as well. Patrick Earley (WMF) (talk) 18:25, 5 November 2015 (UTC)
- Certainly userpage vandalism is used to harass. I have seen many examples. As to whether vandalism on Jimbo's page is intended to harass him (the example used by NickK), well, I guess Jimbo would be better placed to give an opinion on that. Andrew Dalby (talk) 10:16, 9 November 2015 (UTC)
Is harassment a global or a local problem?
This survey seems to treat harassment as a universal problem that is present to more or less the same extent across all Wikipedia editions. My experience tells me this is far from true: some wikis are much worse than others. I have a concrete wiki in mind, but the survey is not interested in finding out which one, as it is not the wiki where I edit the most. In the end, everything will be averaged, and input from en wiki editors (en wiki is not perfect but is still a paradise compared to some wikis), who are the most numerous, will drown out everything else. While the WMF is obviously aware of harassment as a universal problem, it refuses to deal with the concrete issues and complaints. Surveys do not solve anything; sometimes, one needs to check where the stink is coming from. GregorB (talk) 20:15, 5 November 2015 (UTC)
- I had a similar issue. I edit/contribute on several projects (some more, some less frequently) and found it hard to make overall statements. It would be ideal if one could do the survey for every project, but that would not be very time-effective. I disagree with "surveys do not solve anything". They do provide awareness, but I wonder how project-specific this awareness can be. The more the better.--Jetam2 (talk) 22:20, 5 November 2015 (UTC)
- What I meant with "surveys do not solve anything" is not that surveys are generally unnecessary or useless. It's just that I don't understand why a survey which is specifically about harassment asks me where I edit the most (en wiki, thank you for asking...), instead of where I'm harassed the most. Of course nobody will be happy editing where they are harassed, and will move someplace else. So, this question, instead of detecting problematic wikis, will detect wikis which are not problematic (or, more likely, detect nothing). To me, it's as if framing the question this way indicated a lack of concern about the root problem, which is toxic communities. And, since there is seemingly no interest in root causes, I'm afraid that the only outcome of the survey will be generalized statements such as "yeah, people are not perfect, let's introduce more policies". GregorB (talk) 11:22, 7 November 2015 (UTC)
copy-paste mistake
There is a duplicated option in the question about how long the harassment lasted. The second-last item should probably read 'more than a month but less than a year'. Cheers, Pgallert (talk) 08:03, 6 November 2015 (UTC)
- Corrected. Thank you for letting us know! Kalliope (WMF) (talk) 14:00, 6 November 2015 (UTC)
A few more notes
The initiative is to be applauded, and I do applaud it. I was in fact thinking of a similar one on my home wiki. On the other hand, I think that the survey could be made more wiki-specific. The survey does not work so well for a (predominantly) online community like ours. Perhaps a question on harassment could be included in post-event feedback after offline meetings. Sexual harassment is a serious issue. In our community, however, I would not expect it to happen. Not that we are awesome that way; it's more that there is close to zero personal (offline, face-to-face) contact between Wikipedians. Online-type harassment happens much more often: escalating discussions, name-calling (mostly related to educational underachievement, mental health and similar topics), edit wars, reverting edits without explanation, vandalizing/blanking of personal discussion pages or, at the other extreme, ignoring a person totally. These types of harassment could have been studied in more detail. Another aspect that could have been included in the survey is the responses of different types of users. An admin has more tools and responsibility to deal with harassment than an editor. Blocking users who harass, for example, was not mentioned in the survey. Some wiki projects are more global and international (English Wikipedia, French Wikipedia, maybe Arabic Wikipedia?), others are less so (Slovak Wikipedia etc.). In my experience, there is sometimes a homeland vs diaspora attitude that can become a cause of harassment or at least cause feelings of being unwelcome. Some languages are gendered. Unless a person chooses not to participate in discussion, it is very hard to avoid revealing one's gender. Having project- or language-specific surveys would be a good step, but I also understand there are time and staff limits.--Jetam2 (talk) 14:48, 6 November 2015 (UTC)
- Jetam2, thank you for this. This is the first survey on this topic in the Wikimedia movement (that I know of :), and there is always room for improvement. We hope to do follow-up surveying in the future, and your input is very valuable for deciding what changes we can make in the data being collected. Patrick Earley (WMF) (talk) 19:30, 6 November 2015 (UTC)
- So, exactly how much money is WMF wasting on this project? Doesn't Wikimedia already have plenty of conflict resolution mechanisms? Do they even need my donation anymore? XavierItzm (talk) 08:05, 7 November 2015 (UTC)
Selection bias?
When I logged onto Wikipedia, a banner appeared asking me to participate in a survey on harassment. This, I'm afraid, will produce selection bias. Someone who believes that they have been subjected to harassment, or who's witnessed what they regard as the harassment of others, will be strongly motivated to respond to the survey; someone who's never felt that they've been harassed will be less likely to respond.
At the "Research" page, it's stated that "Community Advocacy has been advised that prematurely releasing the questions may bias the survey results." I'm afraid that the description on the banner will itself tend to produce a bias toward overestimating the incidence of harassment. It may be too late, but could I suggest that the description be changed to a more neutral phrasing, e.g. "participate in a survey", with no mention of the survey's topic? — Ammodramus (talk) 15:52, 6 November 2015 (UTC)
- Even worse, the two groups that will be most interested in participating are: a) those who want the WMF to adopt serious measures against harassment, b) those who think that the WMF should not act against harassment and should leave it up to local communities. Unfortunately, there is little chance we will have participants in the middle, i.e. those who are not that concerned with the topic — NickK (talk) 17:26, 6 November 2015 (UTC)
- Ammodramus, NickK - Sampling bias is something we have certainly considered when constructing this survey. We have used banners because we don't have any other data to segment users based on their relation to harassment; thus we are aware of the homogeneity of the response pool, and it is addressed as a known bias in the survey input.
- In terms of the description of the survey in the link, removing the word "harassment" from the banner wouldn't make much difference, as people will still see the survey topic on the first page in Qualtrics. They still have the chance to close it without responding, so removing "harassment" from the banner would likely only drive people to click on the link out of curiosity. Curiosity can create a bias just as motivation can.
- In short, each sampling method is going to introduce some margin of error, and we've chosen what we thought was best in this case. We've assessed the risk and outcome of each of the sampling methods, and we decided to go with what we believe will bring the most reliable data from the survey. Patrick Earley (WMF) (talk) 19:25, 6 November 2015 (UTC)
- There's nothing wrong with engaging those who have strong feelings about a topic. That's just how a democracy should work. In this instance it's really better than a random sampling.
- Wikipedia has major issues with suppression of opinion. All too often legitimate criticism of administrator actions is falsely labeled as AGF, threat, or harassment. Most of the administrators of today came in during a time of extraordinary growth and were not selected based on their ability to be good cops or judges. Another problem is that the English ARBCOM does not examine points of fact or allow public viewing of their BASC discussions. Final point. The doctrine of NOTTHEM is flawed. In some circumstances YESTHEM arguments are legitimate. PhanuelB (talk) 16:26, 11 November 2015 (UTC)
Survey Design
Moving through the questions, I found many basic design problems in the way questions were worded and answer choices were worded or presented. I'm fairly certain there are other WP users who have relevant social science survey design experience and expertise. I'd strongly suggest you tap some of that expertise for your next site-wide survey. Meclee (talk) 22:42, 6 November 2015 (UTC)
- I am going to agree with Meclee here. There is an enormous gap between "once a day or more" and "once or twice a week". Many of my answers would fall into that gap. I'm not sure I can answer the questions with any degree of reliability given this. The question "How many times have you experienced incidents like the ones described below while working on any of the Wikimedia projects?" is not entirely clear; much of the problematic behaviour can happen off-wiki (e.g., IRC, emails, mailing lists, other public forums) because of work on a Wikimedia project, but it's not clear whether you want those events included or excluded. The sliders on the third page are not appropriately set, and give the impression that getting called a bad name once is equivalent to being stalked once; realistically, stalking doesn't fit into the list because it is almost guaranteed to include significant off-wiki activity. "What was the harassment based on?" doesn't include some of the most obvious and common reasons for harassment, which include "content dispute" and "movement-related role" (e.g., administrator, steward, chapter executive). "How did the witnessing of another community member being harassed affect your own participation to the Wikimedia projects?" does not consider the possibility that users continued to contribute while contemplating stopping. The questions in the personal profile section that list what personal information one might share do not have an option "I share it with some people but not others" - which is probably amongst the most frequent answers. Finally, there's no back button on the survey, which means people cannot go back to correct responses without starting all over again.
On the other hand, I'm very happy that the WMF is actively working to collect information in this area. One hopes that the information will lead to actionable plans (actionable both in the sense of "something that can be done" and "the resources are put in place to take actions"). Wikimedia can't patrol the internet (i.e., activities outside of venues that are under WMF control), and it may become clear that there are some pretty divergent ideas on what does and does not constitute harassing behaviour, but this is a good first step. Risker (talk) 06:34, 7 November 2015 (UTC)
Firefox 41.0 with Adblock user here. I confirm that the survey layout itself was all messed up. There were no tables as such, and the strings were shifted, so I had to mentally reconstruct which radio button was meant for which option. Also, I got stuck many times, as the radio buttons did not deselect one another. I think the survey engine should be changed. Zezen (talk) 14:43, 7 November 2015 (UTC)
Add "prefer to skip this question"
All questions should have an explicit "prefer to skip this question" choice. Davidwr/talk 22:56, 6 November 2015 (UTC)
"Other" button
Several questions have an "Other - please specify" option, but if one has nothing to put in the "Other" category, the survey still makes you check a box next to it. ONUnicorn (talk) 12:56, 8 November 2015 (UTC)
- The survey was initially set up so that if one had to select a response for the *Other* statement in a grid question but had nothing to add in the text box, they could select 'Never' or 'Not applicable'. As a few participants found this confusing, all grid questions with such an option have been changed so that they no longer prevent the participant from progressing to the next question(s). In other words, you no longer have to select something for the *Other* statement. You will get a reminder letting you know that you have skipped a question if that's the case, but you will still be allowed to continue. Kalliope (WMF) (talk) 12:12, 9 November 2015 (UTC)
Qualtrics
I use the "Ghostery" browser plugin, which blocked the redirection to the survey in the first place. I didn't even start the survey, because you use the services of a private company which is obviously active in the field of data collection and analysis, but about which almost no information is available. The Wikipedia article is a stub, and you expect users to trust this service? The surveys conducted through Qualtrics will be biased, because privacy-aware and generally suspicious users are likely to not even look at the first question. This is also a design fault. The WMF should instead spend the money on developing its own survey software, which would make the projects more independent and provide better protection for our data. --CHF (talk) 22:08, 9 November 2015 (UTC)
- Hi, CHF. Thanks for your feedback. I'm sorry that you weren't comfortable completing the survey. The invitation links to their privacy policy specifically because we wanted users to be able to see what their practices are. It would be fantastic if we had in-house built survey tools, but unfortunately we currently don't. We really want feedback on this issue, though. Hopefully you will feel comfortable talking about the issue in general in our upcoming consultation which we intend to launch next Monday at Harassment consultation 2015. We were going to run the survey and consultation simultaneously but were advised that this would bias our survey results, so we had to delay the consultation component. --Maggie Dennis (WMF) (talk) 23:30, 9 November 2015 (UTC)
Language, focus - get help - better help
You would think that a survey started due to gross incaution with words would be put together with the utmost care. I did not get that impression at all.
In one case I was stunned at an omission. For the question (paraphrased) "how did you react to witnessing harassment", where was the answer "I edit less than before" or "I now edit defensively"? There was "I walked away entirely". There was "but I came back". There was "my name is Pangloss". But not "I was depressed at the level of acceptance of ongoing nastiness, and so naturally repelled, felt less desire to participate, though I keep coming back because I love the WP idea." How much review did these questions get?
Again, in this too frequently poisonous climate of modern life, where respectful discourse is preconditioned on how reasonable "the other side" is, how could you not yourselves monitor your own speech? Do you have any idea of the alarm bells set off by calling something a 'movement', as you did on one page? You may know what you mean. I get the inkling it may mean WP/WM in general. But 'movements' are those things where social action / social justice groups exercise domination, ostensibly for some societal good, but employing mobs and shaming, and chiefly to maintain themselves. Did you mean that kind of 'movement'?
You have not been careful enough here, and need to broaden the people reviewing and finding those 'unforgivable' mistakes. Like calling a trough a spade. Not giving a fig can have great repercussions. (And if you don't know what that refers to, ask someone who knows) Shenme (talk) 01:19, 11 November 2015 (UTC)
- Hi, Shenme. I'm sorry that you felt the language was inappropriate. The questions received extensive review, from staff, our community workgroup and the external experts we consulted. In terms of "movement," it's fairly commonly used to describe our work - see Wikimedia movement affiliates and wmf:Wikimedia Movement Strategic Plan Summary, for two examples. It is linked in the sidebar on Meta as "Movement affiliates." I was unaware that the term was controversial, but its use in the survey reflects its usage in such locations as those. --Maggie Dennis (WMF) (talk) 01:27, 11 November 2015 (UTC)
Wondering how many others have seen "elitism" in reverts
As I did the survey, it brought up the anger of every reverted edit... Now, my edits often take hours because I am disabled and a slow typist, but these carefully-thought-out edits often end up being deleted in a single click of the revert button: some 10k editor has taken two seconds to eradicate hours of work and record his 10,001st edit.
(It is my understanding that a revert counts as an edit for the one who clicks the revert link, and that the revert also removes the edit from the reverted author's edit tally; please correct me if I have misinterpreted what I have read.)
It might do for those interested in the motivations of wiki participation to read up on game motivation; I found Bartle's player types fascinating and, upon thinking about it, seemingly on point. (One should note that even Wikipedia awards badges as well as contributor status, not to mention the list of superuser permissions given to users to be more productive in titled roles such as moderator and administrator. Thus the extras like badges, titles, and extra permissions and tools are all "rewards" in the parlance of Bartle's taxonomy of player types and game theory, making the editing of a wiki a game just like the board game "Risk".)
Before someone says a wiki is not a game, I know that. But editing a wiki acts just like a game, and I purposely used "Risk" as an example. You can stay in a small area of expertise or "explore" by clicking the "Random page" link; those who focus on a single area often become the experts in that area, making them the "killer" according to Bartle. Some are more social, while the "achievement killers" tend to find an insignificant error and then revert with a terse rule reference ("rule Nazis"). (Has anyone been frustrated by someone who spent hours memorizing the rules so they catch you playing Risk phase three before you are done with your phase two, and the "rule Nazi" says you cannot finish phase two because you started phase three on the other side of the board?)
There are so many other parallels but, as I have pointed out, it is hard (even painful) for me to type for long periods, so I will revert to my main question...
Please give examples of "elitism" you have encountered in wikis (not just Wikipedia projects but other wikis also) and any ideas you have to prevent it from discouraging others. Reverts are certainly a key tool used by this sort, but a rewrite that changes the meaning could be just as bad. (Do not include edits that add information or change structure but not meaning, unless there is evidence that the re-editor did so with an air of "you have no idea what you are doing but I know everything".) Remember that a wiki is supposed to be edited and re-edited.
While my reason for adding this question is to get an idea of the extent of lazy editors reverting instead of editing for clarification, I want full feedback on anything you think was motivated by "elitism", the reason(s) it felt elitist, and especially any ideas you can think of that would reduce or eliminate the problem in the future.
In the survey I asked for a rule change and a procedural change to verify the rule change (limit reverts to blatant vandalism and require a revert to be verified by multiple veteran editors). I also thought about other checks and balances, like a time window during which only an edit can occur (to encourage editing rather than reverting), but that doesn't feel as good as the verify option (besides, any edit-only time span wouldn't give me enough time unless it was at least a week; I just don't check the wiki that often). Qazwiz (talk) 07:26, 11 November 2015 (UTC)
Murder of Meredith Kercher Article
I just became aware of this survey. I think it's great that the WMF is willing to take a look at the subject. The problems here are worse than the community generally recognizes.
I hope that the organizers of this survey will take a hard look at the events surrounding the Meredith Kercher topic and its treatment of Amanda Knox and Raffaele Sollecito. There has been some measured improvement in the topic since the acquittal of the two in March 2015, but the fact remains that many editors who broke few if any rules were blocked because they challenged a deeply troubled article. The use of false allegations (including harassment) was a central element of the events.
Here is an article I wrote which documents many of the flaws in the article. Please note that Jimmy Wales has made extraordinary statements about the article and the associated blocks.
Please take note of this proposed presentation at the WikiconferenceUSA last August. Toward the end one of the commenters, Dominic, points out that I had been banned for threats and harassment. The allegations of harassment and threats are manifest nonsense and were part of a concerted campaign by British administrators convinced of Knox and Sollecito's guilt to eliminate all those who challenged their treatment of the two.
Notes:
- (1) RS excluded from the article
- (2) Commentary by Jimmy Wales
- (3) Criticism of the article by RS and retired FBI agent Steve Moore
- (4) Signpost article talking about the dispute
- (5) Article on Hate Site by a Wiki administrator who has implemented many blocks on the topic
- (6) British tabloid reporting
- (7) Threats made to family of RS Nina Burleigh by Peter Quennell, Webmaster of TJMK
- (8) Presentation to a Wiki Meetup in NY
- (9) Article detailing the dichotomy between the article's administrators, RS, and Jimmy Wales
- (10) List of Editors blocked on the topic
PhanuelB (talk) 14:46, 11 November 2015 (UTC)
- Hello PhanuelB, thank you for your contribution. Please note that this page is not a place to report incidents of harassment, dispute actions taken (or not taken), discuss specific incidents of harassment, or dispute specific article content actions. Rather, it is to raise issues or concerns about the survey itself in terms of content, translation errors, formatting errors, design, approach, methodology, things missed, etc. If you wish to report a specific incident or raise your objection to actions taken, I would advise that you use the appropriate channels. Otherwise you are welcome to follow the discussions soon to take place at the Harassment consultation 2015 page, which will be open from November 15th onwards. Kalliope (WMF) (talk) 14:43, 12 November 2015 (UTC)
Translations
In the German version of the survey, one section is not translated:
- Being treated differently/unfairly based on personal characteristics, instead of merit [Discrimination]
"Unfair" is translated as "unfähr" (instead of "unfair", as in English). --Martina Nolte (talk) 20:54, 14 November 2015 (UTC)
- Thank you Martina Nolte. Corrected. Kalliope (WMF) (talk) 11:57, 15 November 2015 (UTC)
Witnessing others being harassed
On the third-to-last page of part II, you ask whether the participant has witnessed others being harassed and offer 'never/not sure' as a possible answer (quoted from memory, as it is not possible to return to a page once you've gone on). On the following pages you then force the participant to state in what ways they responded to this harassment, without offering 'not applicable' as a possible answer, thus forcing them to give nonsensical answers if they want to continue filling in the survey. --JaS (talk) 14:54, 15 November 2015 (UTC)
- @JaS: I also noticed this: an odd mistake. --Rubbish computer (HALP!: I dropped the bass?) 01:07, 4 December 2015 (UTC)
Survey results suggest flaws in design and execution
I have been challenged by @User:GorillaWarfare on Twitter to generate a discussion on Meta to review potential flaws in the design and execution of this Harassment survey, prompted by some of the hard-to-believe findings in the report.
An initial glaring problem (to me) was the finding that 54% of Wikimedia (Wikipedia, Commons, Wikidata, etc.) users said that they definitely or maybe (they didn't know for sure) have experienced harassment on these sites (Figure 13), of which 61% said that the harassment took the form of "revenge porn" (Figure 16). So, netting those together, it means that just about 33% of all Wikimedia project users have been or may have been personally victimized by revenge porn on a Wikimedia project site.
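(A minimal sketch of the "netting" arithmetic described above, assuming the two percentages from Figures 13 and 16 can simply be multiplied together; the 2,495 respondent count is the one quoted later in this thread, and the chaining itself is the commenter's inference, not a calculation presented in the report.)

```python
# Sketch of the netting arithmetic quoted in the comment above (assumption:
# the two report percentages can be chained by simple multiplication).
respondents = 2495        # respondents to the harassment question (Figure 13)
experienced = 0.54        # said they definitely or maybe experienced harassment
revenge_porn = 0.61       # of those, said it took the form of "revenge porn"

combined = experienced * revenge_porn
print(f"combined share of respondents: {combined:.0%}")       # ~33%
print(f"implied head count: {combined * respondents:.0f}")    # ~822 respondents
```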
Now, let me say this. GorillaWarfare has been victimized by pornographic image associations with her name and/or image, through photoshopping and the like, but (as far as I know) she has not been a victim of actual "revenge porn", which Wikipedia defines as "sexually explicit footage of one or more people distributed without their consent via any medium. The sexually explicit images or video may be made by a partner of an intimate relationship with the knowledge and consent of the subject, or it may be made without his or her knowledge". The key difference being -- is the image actually of the person in question, or is the image doctored to look like the person in question? While I am sure that, across the vast number of users of Wikimedia projects, there may have been a half-dozen instances of a Wikimedia user being depicted in sexually explicit footage and then finding another Wikimedian distributing that content without the subject's consent, it strains credulity to believe that one-third of Wikimedians have been thus victimized. If that were the case, it would be on national nightly news for weeks running.
What I believe is happening here is that respondents to the survey are taking the opportunity to "check yes" to things that come close to what happened to them, or that they heard have happened to others around them, by which the survey then attributes these specific things actually and literally happening to everyone who "checked yes". Another example, one verbatim from the study results, says: "Had an explicit pornographic website created based my username". That's horrifying, demeaning, and defeats the human integrity of the victim. But that's not "revenge porn", and I will bet $10 that the respondent who typed that specific example marked that they had been subject to "revenge porn".
There are numerous other contradictory findings from this survey, which we can continue to discuss here. For example, Figure 28 reveals data that is impossible within a sample tallying to 100%. How could 56% say that they "did not react / ignored" the incident of harassment, while another 50% said that they "discussed it with other community members"? If you did not react, you can't have also discussed it with others. So, at least 6% of respondents to this question misstated something. Or, there are other ways to interpret the discrepancy -- they may have ignored one act of harassment, and then discussed a second act of harassment; or, they initially ignored an act of harassment, but later decided that they felt the need to discuss it with the community. Either way, it's an example of confusing or misleading survey design. You want to construct a survey so that there is not a strong chance that the data will come out in a misleading way or make the respondents look like they don't know how to answer a seemingly straightforward question. This is a challenge for all survey writers, and professionally I've written at least 2,000 surveys in my career. Outside of a few people on the WMF staff, I may be one of the most qualified survey designers who has ever volunteered to help the WMF with survey design, but they don't seek my assistance when new surveys are cooked up. And when I've stepped in before (such as at Talk:Fundraising 2009/Survey), it seemed that the WMF staff leader of the study didn't have any time to respond to the community's input.
I don't want my criticism to be taken as "Well, Mr. Kohs doesn't think there is a problem with harassment on Wikimedia projects". There is absolutely a huge problem with it. I myself have been the victim of numerous ongoing forms of harassment -- even an off-Wikimedia wiki site (which had and has heavy participation by Wikipedia regulars, even some administrators) lampooning my sex life, illustrating a "FleshLight" as my favorite toy, and questioning whether my daughter is even my own biological offspring. So, I know what it's like to be a victim of online harassment. On the flip side, I have pursued information about various Wikimedians that they have in turn interpreted as "harassment". For example, when one Wikimedian who had numerous photos of herself on Commons was accused by another Wikimedian of physically threatening him at a conference, and I copied a couple of her Commons photos to identify her as the accused, suddenly the ID badge in her photos was blurred out, and then shortly after that, all of her photos were summarily removed from Commons. I understand through back-channels that she views my action of copying her freely-licensed images as "harassment", when all I was trying to document was that when Wikimedians are caught doing something embarrassing, the insiders' first instinct is to hide the identity of the person who messed up. Not the hallmark of an "open" and "transparent" community.
If the Wikimedia Foundation is going to tackle important issues like harassment of participants on the Wikimedia sites, then they owe it to us to design surveys that produce credible results, not exaggerated and misleading results. - Thekohser (talk) 13:39, 1 February 2016 (UTC)
- Thekohser, I don't know if this will help you, but the questions from the survey show that some allowed for multiple answers. The confusion you mention in adding up to 100% is due to respondents replying with multiple answers (I believe it was question #13 that Figure 28 came from). CKoerner (WMF) (talk) 16:10, 1 February 2016 (UTC)
- I do already understand that question #13 was a multiple response question, but the WMF set up the list of possible answers with one mutually exclusive item -- if you "did not react / ignored" something, you cannot have also taken any of the other actions listed. So, with 56% saying they "did not react / ignored", that means that the largest possible tally for any of the other ACTIONS would be 44%, the complement to the 56% for INACTION. Do you see what I am saying? It has nothing to do with the fact that it is a multiple-response question. That was implicitly understood, even in my critique. - Thekohser (talk) 16:18, 1 February 2016 (UTC)
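(A minimal sketch of the consistency check argued in the comment above, using the percentages quoted from Figure 28 and assuming that "did not react / ignored" is meant to exclude every other action.)

```python
# Sketch of the overlap argument: if "did not react / ignored" excludes all
# other actions, its complement caps every other option, so a 50% figure for
# "discussed it" implies at least a 6% inconsistency in the responses.
ignored = 0.56      # "did not react / ignored" (Figure 28)
discussed = 0.50    # "discussed it with other community members" (Figure 28)

max_other = 1 - ignored                     # largest possible tally for any other action
overlap = max(0.0, discussed - max_other)   # portion that cannot be reconciled
print(f"maximum possible for other actions: {max_other:.0%}")  # 44%
print(f"implied inconsistency: at least {overlap:.0%}")        # 6%
```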
- Thank you for your comments, Thekohser. Let me see if I can address some of your concerns.
- When reading through a report, it is always worth going through the entire document for a clear understanding of the information presented. For example, p. 2 of the report states "As the survey was a voluntary opt-in survey, the sample of people who opted to respond to it might not be representative of the general Wikimedia user base." To me this statement leaves little room for the misunderstanding that the findings of the report represent the entire Wikimedia user base; rather, they can only reflect the contributors who took the time to fill in the survey.
- You have concluded that "just about 33% of all Wikimedia project users have been or may have been personally victimized by revenge porn on a Wikimedia project site." based on figure 13, p. 15. Upon a more careful look at figure 13, you will notice the statement "Out of 2,495 that responded to this question..." [followed by the figure, presenting the %]. If this is not a clear enough statement that the figures presented reflect only on the participants who answered that question, let me run some reverse math on your conclusion. By all means correct me if I'm wrong but, 54% of 2,495 respondents accounts for 1,347 users. 61% of those 1,347 users accounts for 822 [users]. Based on your conclusion's logic, this means that 822 users... account for 33% of all Wikimedia project users. I am fairly certain that this is grossly inaccurate, and thus the very statement it was based on is also erroneous. If anything, the % presented can only reflect on the contributors who took the survey - not the entire Wikimedia community. This, I believe, is made clear in several parts of the report, as pointed out above.
- In regard to the definition of revenge porn, it is worth noting that brief definitions of the types of harassment listed in the survey had been provided to the respondents, in an effort to avoid misunderstanding of said terms. Those same definitions are also listed in the report's Appendix, as per the note at the bottom of page 17 [which is where the different types of harassment appear in the report for the first time]. Revenge porn, for the purposes of the survey, had been defined as "publishing of sexually explicit or sexualised photos of without one's consent". Which means that this term [for the purposes of the survey and the subsequent report] includes more than just somebody's NSFW photo being publicised. It includes any kind of sexual or sexualised [photoshopped] image that has been linked to a Wikimedia contributor, even if only through their username. This may differ from other definitions of revenge porn, but it may justify the higher-than-expected % of respondents who selected it.
- Not all figures total 100%. It is possible for a respondent to have been subjected to more than one form of harassment while contributing to Wikimedia projects. As such, it is possible that they reacted in more than one way. Certain survey questions allowed the respondents to select more than one of the options listed, and/or add their own if it wasn't listed already. The survey was not a follow-up questionnaire about a specific experience of harassment but rather an inquiry into whether respondents had experienced harassment overall.
- The report that was released on Friday is a preliminary version, as per the file's description. We are certainly open to suggestions on clarifying certain points made, if those do not appear to be clear. Kalliope (WMF) (talk) 16:45, 1 February 2016 (UTC)
- @Kalliope (WMF) — So why is WMF spending money on non-scientific surveys? Why are we not doing scientific sampling to generate survey participants? Why are we not presenting response data in tabular format but are instead producing a "report" that appears to be little more than a set of PowerPoint slides? Why are we hiding the percentage of men and women who took the survey (pg. 12) and why are we not more thoroughly examining the ways in which harassment and the reaction to harassment differs according to gender? It seems to me that this product is little more than a propaganda document to support an ongoing political debate rather than a serious examination of a very real problem. WMF would be wise to take up Mr. Kohs's offer of assistance in survey design. It would also be good to get a couple people with degrees in statistics on board. (And, for the record, I'm also a victim of harassment related to my Wikipedia activities, via a non-WMF attack site. It does go with the territory, lamentably.) Carrite (talk) 17:32, 1 February 2016 (UTC)
- Kalliope, your comments about the people answering the survey not necessarily being representative of the userbase are true but trivial. Of course no survey is perfect. But this kind of survey is useful only insofar as the surveyed population is close to the actual userbase, so that one can reasonably extrapolate from it. One can't have it both ways. Your last point has already been addressed by thekohser above in reply to another question. Some other comments:
- The Pew survey breaks down harassment by gender, and also divides it into "less severe" (name calling and embarrassment) and "more severe" (stalking, physical threats etc.). This kind of thing should be done because, as the survey notes, men and women experience different types of harassment differently (men experience name calling, embarrassment and physical threats more, while women experience stalking and sexual harassment more). Also, the most popular (and most effective) response to harassment is ignoring it (as both the Pew survey and this study note): less severe types are more easily and effectively ignored.
- Also, young people experience much more harassment. The data should also have been broken down by age. Kingsindian (talk) 17:39, 1 February 2016 (UTC)
- Astonishing! Upon what foundation can you state with any hint of credibility that young people "experience 'much' more harassment"? Citation needed, dude. I am so strangling that I wish I had taken an extra 'nerve pill' today. Please come be a fly on my wall. And yes, I participated in the survey, and yes, I sadly fall into many demographics other than youth... Fylbecatulous talk 15:19, 2 February 2016 (UTC)
- The Pew Research Center survey he linked to. It's in the summary of findings. MLauba (talk) 00:21, 3 February 2016 (UTC)
- As a member of enWP arb com I have certainly witnessed examples of revenge porn, but I find it almost unbelievable that, using even the broad WP definition, 61% (or 31% -- I am not sure which number is the relevant one) of participants have received it. I urge that it be reported to us if related to enWP, for it's the sort of thing for which I will support our taking whatever on-wiki action is possible, and referring to the Foundation. (There has been some dissatisfaction that we have not taken action when the identity of the harasser cannot be reliably determined, or where the person receiving it is quite sure but our very limited investigatory power cannot confirm it; nonetheless it should always be reported--either to us or directly to the Foundation.) If it really is a problem to the extent specified, I think we very much need to consider our responses. The same goes for similarly extreme forms of harassment. (But though I share GK's concern, I think that most people receiving this sort of harassment would very much like the matter hidden from public view as soon as possible, and we serve them best by doing so.) (All this is my personal comment, not that of the committee.) DGG (talk) 20:55, 3 February 2016 (UTC)
- The good news is that the mystery of the obviously false "revenge porn" numbers has been solved. The bad news is that the cause is a software defect that renders all the data on page 17 worthless and makes any conclusions based upon them impossible. I'll cross-post the summary I presented on En-WP at Jimbotalk under a hat for those of you who have not been following the blow-by-blow of the analysis on Wikipediocracy.
The following discussion has been closed. Please do not modify it.
Defective WMF harassment survey
The mystery of the obviously incorrect "revenge porn" results of the WMF Harassment Survey has been solved on Wikipediocracy by Belgian poster Drijfzand... Basically, this survey of 3,845 Wikipedians across a range of WMF projects (45% of whom were from En-WP) generated 2,495 responses to a question asking whether they personally experienced harassment. Of these, 38% (about 948 people) said yes (pg. 15). However, on page 17, in what is purported to be a breakdown of the forms of harassment experienced by these editors, an astounding 61% (about 578 people) are said to have claimed to be victims of "revenge porn." This, to anyone who ponders the number for more than 6 seconds, appears patently absurd — bearing in mind that the survey respondents were about 88% male and that the great majority of Wikipedians maintain some degree of anonymity. Drijfzand observed that the number of responses for doxxing, revenge porn, hacking, impersonation, and threats of violence all fell within a range of 5% of one another — which she or he argued "simply can't happen." I theorized that the problem was a software glitch, and Drijfzand identified the problem as a set of defective sliders in the survey form which refused to accept a value of 0, a bug identified by Burninthruthesky on November 3 and apparently remedied on November 4. LINK. Unfortunately, the survey was not launched on En-WP until Day 5 (to allow more responses from smaller wikis so as to reduce the weight of the large projects, see pg. 2), meaning that bad data was generated on some projects for nearly a week. Whereas the survey should have been aborted and restarted, it apparently was not, and so the data presented on page 17 (and any conclusions derived therefrom) is a case of Garbage-In-Garbage-Out. Once again: a failure to adequately beta-test software is evident. There is one saving grace, and that is that we have a very good snapshot of the magnitude of the gender gap based on survey respondents (a ratio of 88:12 for those who indicated a gender, with some 7% of survey participants declining to respond). Assuming a heavier-than-average percentage of women in the "decline to respond" group, this means we are probably in the ballpark of 86:14 or 85:15. There is also, for the first time ever as far as I am aware, a decent survey of the age of Wikipedians. Your takeaway numbers: 35% of respondents (and presumably Wikipedians in general) are age 45 or over; only 24% are under the age of 25. All the fresh faces, many on travel grants, at Wikimania are deceiving — it appears that the median age of Wikipedians is right around 31 years old, give or take. So the expenditure on the harassment survey wasn't a total loss even if it failed at its intended mission (at least in part) due to bad software (leaving aside the very real question of sketchy survey design). Carrite (talk) 19:50, 3 February 2016 (UTC) (male, age 54) Last edited: Carrite (talk) 23:00, 3 February 2016 (UTC)
Please beef up the beta-testing. Carrite (talk) 23:15, 3 February 2016 (UTC)
What was the wording of the invitation?
How exactly did the invitation text read? - Thekohser (talk) 19:45, 3 February 2016 (UTC)