here's a translation: "our workforce couldn't sleep at night knowing people were literally getting murdered because of this stuff spreading throughout our platform, so we're proposing a solution that only further muddies the waters in other aspects"
-
this isn't me being flip: this particular problem is rending the consciences of good, conflicted people inside the company, so not acting on it wasn't going to happen
End of conversation
New conversation
-
It’s also a completely tautological category. How do you know a piece of misinformation will lead to violence, unless it already has?
-
kind of the world's worst litmus test
-
Do you guys think this is a hard line to draw? Like, if you were a moderator for the US, you couldn't spot stuff?
-
i mean, i literally have no idea why people kill and assault people. sometimes people get killed for cutting someone off in traffic. how am i supposed to tell beforehand which posts will stir up violence?
-
Facebook needs to entertain the conversation about what’s acceptable. Anything else from them reads as deflection to me
End of conversation
New conversation
-
The interesting thing is that journalists all think FB should censor stuff and most people who build things here don’t. I really don’t understand why team journo thinks this is such an unreasonable stand.
-
i don't disagree with you that the irony exists, as journalists are calling for censorship. what I propose is clarity: they should just say what they mean, not back into it.
End of conversation
New conversation
-
i mean, all right-wing media falls on an ideological spectrum that eventually culminates in violence?? they must know this, right???
-
if you follow this logic trail even slightly beyond where it is now it falls apart
-
there is way too much unearned good faith here!!!!
-
Easy solution to the problem.
#DeleteFacebook
End of conversation
New conversation
-
Removing content that directly incites violence is a standard policy on most content platforms. Identifying content that directly incites violence results in faster/more accurate outcomes if you’re working with large ops teams. I think this has been FB’s policy all along.
-
The change sounds like FB is expanding the scope of what “inciting violence” is by considering some level of intent, which is difficult to do at scale. So their solution is to have trusted partners verify that there is intent to incite violence before removal.
End of conversation
New conversation
-
I think they mean *directly* leads to violence, but I’m only halfway through the interview. It sounds like sustained misinformation campaigns that might indirectly lead to dangerous situations (if you believe elections maybe have consequences) don’t make the cut.
-
"we're not responsible until we are"
End of conversation
New conversation
-
do you remember the first time you got high and talked philosophy?
-
Just a PR catastrophe. But don’t blame PR. You can’t polish a turd, and this is a company problem, not a PR problem