Also, hi, I’d like to add the YouTube recommendation algorithm fueling extremism and conspiracies globally to the problems the super smart people at Google are tasked to solve. It’s not that the other stuff isn’t amazing, it’s just that it won’t matter if the world is on fire.
#AMLD2019
Mind-blowingly good talk at #AMLD2019 by Jeffrey Bohn, apparently of @swissre. Guess no surprise that the insurance/reinsurance industry harbors this kind of sophistication. Someone get one of these people to write a book. Every bullet point was
https://twitter.com/JLPlante/status/1089835139230765056
Also hi, “AI and China” panel without anyone (yet) bringing up the role of the state or the use of AI/ML for social control, censorship or surveillance.


#AMLD2019
Social credit comes up, as the only thing remotely touching the giant undiscussed topic, and the only concrete thing I heard so far is that the narrative in the West is "unfair". Moving on, I guess.
#AMLD2019
I'll just put the HRW report about the use of big data methods in China to identify people to extrajudicially place in indefinite detention in camps here. These are potent technologies—and the research community has to keep all this in mind!
#AMLD2019 https://www.hrw.org/news/2018/02/26/china-big-data-fuels-crackdown-minority-region pic.twitter.com/lUmVAfyTzJ
First pic, regular TensorFlow. Second pic, tf-encrypted: a Python lib for privacy-preserving ML. This is the track that needs *all* the attention. The ML community can’t control whether its output will be used for good (it often won’t be), but it can develop tools that do good.
#AMDL2019 pic.twitter.com/SKu0G54pdq
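[Context note: tf-encrypted builds on secure multiparty computation. A minimal pure-Python sketch of additive secret sharing, the core primitive such libraries layer onto TensorFlow graphs; this illustrates the concept only, not tf-encrypted's actual API, and the modulus choice is an assumption for illustration.]

```python
import random

# Additive secret sharing over a ring Z_Q: a value is split into random
# shares that sum to the secret mod Q. Any single share on its own is
# uniformly random and reveals nothing about the value.
Q = 2**31 - 1  # illustrative modulus, not tied to any specific library

def share(secret, n_parties=3):
    """Split `secret` into n additive shares mod Q."""
    shares = [random.randrange(Q) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % Q)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares mod Q."""
    return sum(shares) % Q

def add_shared(x_shares, y_shares):
    """Add two secret-shared values share-wise, without revealing either."""
    return [(x + y) % Q for x, y in zip(x_shares, y_shares)]

x_shares = share(42)
y_shares = share(100)
z_shares = add_shared(x_shares, y_shares)
print(reconstruct(z_shares))  # 142: the sum, computed only on shares
```

Addition works directly on shares; multiplication needs an extra protocol (e.g. Beaver triples), which is where most of the real engineering in these libraries lives.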
Now, a talk on federated machine learning. Love this track. *This* is stuff the “AI for good” people should work on: first, limit the massive harms of your own tools. The challenges of ML are bias, surveillance, potency at scale, interpretability—often in combination.
#AMLD2019 pic.twitter.com/SciUOlQ76H
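[Context note: the core idea of federated learning is that raw data never leaves the client; only model updates are sent to a server, which aggregates them. A minimal sketch of the FedAvg-style aggregation step in plain Python; the function names, the toy "training" step, and the data are all illustrative assumptions, not any particular framework's API.]

```python
# Federated averaging sketch: each client trains locally on its own data,
# then the server averages the resulting weights, weighted by local
# dataset size. Only weights travel; local datasets stay on the clients.

def local_update(weights, data):
    """Placeholder for local training: one trivial step nudging each
    weight toward the mean of the local data."""
    mean = sum(data) / len(data)
    return [w + 0.1 * (mean - w) for w in weights]

def fed_avg(client_weights, client_sizes):
    """Server-side aggregation: dataset-size-weighted average of weights."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(w[i] * s for w, s in zip(client_weights, client_sizes)) / total
        for i in range(n_params)
    ]

global_weights = [0.0, 0.0]
clients = {"a": [1.0, 2.0, 3.0], "b": [10.0, 20.0]}  # data stays local
updated = [local_update(global_weights, d) for d in clients.values()]
sizes = [len(d) for d in clients.values()]
global_weights = fed_avg(updated, sizes)
print(global_weights)
```

In real deployments the aggregation is often combined with secure aggregation or differential privacy so the server cannot inspect individual client updates either.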
Because I keep having a typo in the hashtag, reposting. Here's a comparison between regular TensorFlow code and tf-encrypted, with the privacy-preserving Python lib added. I'm constantly surprised how relatively small these research programs are given their importance.
#AMLD2019 pic.twitter.com/BTFkaC4Uff
I love love love the idea of musical interludes at conferences. Plain awesome.
#AMLD2019 pic.twitter.com/xrRrAPAZKC
Great talk on intelligence and machines by @Kasparov63. I disagreed with a big chunk of it, though I share his optimism; I just arrive at it from a very different path! It’s actually fun to disagree with a fellow optimist. Will add a few points to my talk tomorrow.
#AMLD2019 pic.twitter.com/nXXvxiOTZ0
Cool presentation by @NicolasPapernot on machine learning security based on the Saltzer and Schroeder principles. The security/privacy track at #AMLD2019 has been robust and the rooms are packed. (As I keep tweeting, this is super important for the research community to prioritize.) pic.twitter.com/KfEFrwmnsO
Almost talk time! Excited and honored to address the research community.

#AMLD2019 pic.twitter.com/z4EALna0rr
Talk done! This slide is something I will write up long-form. Bias in ML is a real issue, whether it reflects existing biases or the choice of optimization targets, but real solutions are not confined to ML. And there are other urgent within-ML research questions!
#AMLD2019 (pic by @AnnUkhanova) pic.twitter.com/vz8SVMU6lc
Detection *at scale* is a game changer, something that’s (already) restructuring the balance of power between people, corporations and governments. With that, bye
#AMLD2019. Thank you for a great experience. https://twitter.com/dylanrmuir/status/1090268562369871875