A thread for posting and discussing information and news relating to information warfare. By the way, here’s my understanding of what information warfare actually is. Information warfare involves using or (deceptively) manipulating information to harm an adversary. Two approaches come to mind: causing political damage to politicians or institutions (e.g., a government agency, a news outlet, an academic institution, etc.), often by using information to create unfavorable impressions and/or undermine faith in those individuals and institutions; and widening existing social divisions and exacerbating polarization. Basically, an adversary will manipulate and poison the information landscape–the public square where people get, think about, and discuss news and information. My understanding is that information warfare has always taken place (in the form of political propaganda). What’s different now is that technology has created a new platform, opening the possibility and capacity for greater information manipulation–enhancing the ability to weaponize information, while also weakening defenses against it. (Actually, the type of defenses we need might never have existed; they may be something we have to create.) In my view, politicians and prominent individuals, key institutions (and even businesses), and the general public are all vulnerable right now.
As for hyper-warfare, my understanding is that this is a strategy that uses unconventional means (i.e., not conventional military force) in a comprehensive way to weaken and defeat an opponent. In the case of Russia, my understanding is that the Russian government will weaponize anything and any part of its government or society, in whatever way possible. For example, the government controls banks, and those banks can be used to help launder money or to fund illicit activity; it will use organized criminals and oligarchs to corrupt prominent individuals or organizations in other nations, which creates kompromat that it can then use to control those individuals. There’s a comic book character I think of when I think of this approach, and that character is Bullseye. Bullseye was an assassin known for taking anything and turning it into a deadly weapon–a playing card, a table, whatever. Hyper-warfare is sort of like this. It differs in that the approach may not be so direct–as in, blowing up an adversary. The key here is guile, cleverness, deception, psychological manipulation. The other character that comes to mind is Loki–someone who has to rely on his wits and his ability to deceive and manipulate in order to defeat physically superior opponents. (I think this entails much of the type of information warfare I describe above.) I think of hyper-warfare as Bullseye + Loki.
In this thread, I hope to include articles and reporting to shed more light on the nature of information warfare, as well as give specific examples of it occurring right now. A good starting point is one of the first articles to introduce me to these concepts, Putin’s Real Long Game by Molly McKew.
To Read Later
Politico: Europe isn’t ready to face modern threats
Politico: How Twitter Bots and Trump Fans Made #ReleaseTheMemo Go Viral
http://nautil.us/issue/52/the-hive/modern-media-is-a-dos-attack-on-your-free-will
http://securingdemocracy.gmfus.org/blog/2018/01/16/so-what-did-we-learn-looking-back-four-years-russias-cyber-enabled-active-measures
From Business Insider: How Russia Successfully Interfered with the 2016 Elections
https://www.nytimes.com/2018/02/18/world/europe/russia-troll-factory.html
From Strategic Studies Quarterly: Commanding the Trend: Social Media as Information Warfare
From Yahoo News: Play Fake News Tycoon to Combat Misinformation
From Defense One: How to Inoculate Public Against Fake News
From the United States Army Special Operations Command: “Little Green Men”: A Primer on Modern Russian Unconventional Warfare, Ukraine 2013-2014
From Defense One: We’ve Lost the Opening Info Battle against Russia; Let’s Not Lose the War
1/4/2019
An interview given by a KGB defector in 1984 describes America of today and outlines 4 stages of mass brainwashing used by the KGB. https://t.co/wYYKs1aic2 pic.twitter.com/dQjRgENnpd
— Big Think (@bigthink) January 4, 2019
From The Daily Beast: Grassroots Media Startup Redfish is Supported by the Kremlin
Edit (2/8/2018)
3/8/2018
On Strategic Deception
From Just Security: The Public Needs a Lesson in Russian Strategic Deception: It’s What You Want to Hear
I think keeping what Sipher said above in mind is wise when reading something like today’s NYT story, U.S. Spies, Seeking to Retrieve Cyberweapons, Paid Russian Peddling Trump Secrets
Edit (2/10/2018)
Another key passage as far as how Americans should perceive Russia:
On Fake News
From Buzzfeed: He Predicted The 2016 Fake News Crisis. Now He’s Worried About An Information Apocalypse.
Edit (2/13/2018)
I’m curious about how you feel about these statements, since you offer them with no comment. Does it surprise you and please you to hear that fake news is concentrated among a small subset of people, or that it probably didn’t change many votes?
I hope you’ll remember that this has been my position from the beginning, although at first I wasn’t sure and as the past year has passed, I’ve become more confident about this.
I am slightly surprised about most people not being in echo chambers. I haven’t clicked the external link for that statement but I will later.
And what is your response to Nyhan’s advice to “avoid loose talk about big persuasion effects and ubiquitous echo chambers?”
I posted the tweets because I think they’re an important caveat to keep in mind. I think it’s worth pointing out that these are just a handful of studies, and that they shouldn’t be seen as definitive–studies that settle the matter. But this also applies to articles and research that say the opposite. Finally, we should read all the articles above with care and nuance. For example, here’s the last paragraph in the report about echo chambers:
One thing to consider: Just because the non-politically engaged don’t live in echo chambers, that doesn’t mean all is well. They may face another significant problem–namely, confusion and uncertainty about what to believe, or just plain ignorance of important information and events.
Also, according to the study, echo chambers are stronger offline than online–hence, we shouldn’t conclude that echo chambers aren’t a problem.
I think it’s a valid concern, and something that we should be wary about, including myself. Are you saying that I’m guilty of this?
Right. I’m not necessarily disputing this. But the alleged fact that people aren’t living in echo chambers definitely leaves room for increased hope. At least that’s how I see it.
I don’t know that I would call what you do “loose talk,” but I think there’s some overgeneralization, or some magnifying of a small number of experiences to imply they are widespread. It has been my tendency through most of my life to under-worry, and it seems to have been yours (at least in the past couple of decades) to over-worry.
It is, but the authors say that the most politically engaged do actually live in echo chambers.
I think some of this definitely goes on, and I do tend to over-worry, but I’m not the only one voicing these concerns–e.g., Nyhan: “Doesn’t mean fake news and other online misinformation aren’t big problems – they are! ..
I wouldn’t mind discussing this Foreign Policy article, American Democracy Is an Easy Target. On one hand, it is critical of those hyping the Russian threat to our democracy, who create the impression that the Russians have a well-organized and well-executed plan. On the other hand, the author does seem to think actors like Russia can really damage our democracy by exploiting its existing weaknesses. There’s a disconnect here that I don’t really understand. Putin need not be a mastermind to be a serious threat. If we have significant flaws that even non-masterminds can exploit, then we should take seriously those who have shown a willingness to exploit them. Right?
Or maybe the author believes that ringing alarm bells about Russia obscures existing problems in our democracy. If that’s the case, I’m more sympathetic to this view. However, in my opinion, this is a both/and situation where our existing vulnerabilities plus malicious actors, foreign or domestic, pose a serious risk.
The author may also think that the general public’s perception of the risk that Russia poses is already at an appropriate level. I don’t really get that sense. Or it could be that I’m over-estimating the threat Russia poses.
Thread that offers a kind of response to Brendan Nyhan:
Nyhan responds:
Nyhan and McKew’s discussion, as well as some other reports are summed up and included here at Niemanlab.org.
3/28/2018
Note: Comments in the timeline above worth reading. See some examples below:
I’m somewhere between Kris-Stella Trump and Molly McKew.
Yep, this is a terrific article–informative and suspenseful–reads like a spy novel at times.
Edit (2/17/2018)
Edit (2/18/2018)
3/7/2018
3/12/2018
6/8/2018
Amusing and informative
Russia Isn’t the Only One Meddling in Elections. We Do It, Too.
From the NYT Sunday Review:
Some quick comments:
1. When people say, “U.S. and Russia have always been doing this,” I believe it’s important to recognize that the internet, wireless technology, etc., create a totally different information landscape. It’s important to understand these differences;
2. My sense is that not all attempts to influence an election are equal–that is, some methods are more appropriate than others. Off the top of my head, I would say a transparent and honest approach would be more acceptable. If a government explicitly made reasonable, fact-based arguments for a particular candidate, that is more acceptable than using foreign bots or trolls that pose as citizens or domestic organizations and push fabricated stories to hurt a politician. I do not think the U.S. and Western democracies should use the latter;
3. The article mentions that undermining an autocracy while promoting democracy is different from the reverse. I agree with that. The former is something that I think is justified.
4. I don’t think democratic countries should really try to influence elections in another democratic country–even using the methods that I have said are more acceptable…Or at least I think another democracy should tread very lightly and moderately if they do engage in this.
A Woman Doesn’t Realize She Wasn’t Working With a Russian Group
I guess people can ridicule this woman, and feel some contempt for her, but I sort of sympathized with her for some reason, maybe because she’s older. It seems clear that she didn’t knowingly talk to Russians, and she seems to assume that the people and organizations she communicated with were legitimate Trump supporters. Yes, that’s unwise, but I just have a hard time harshly condemning her. By the way, a part of me feels like the issue here isn’t an unwillingness to admit that she was duped–although, if I think about this more, I suspect I would think it is a big factor. Maybe since the Trump supporters she’s met in real life have been genuine Trump supporters, she has a harder time believing that there are Russian sites attempting to egg on certain groups of Americans. In any event, this is a problem.
she doesn’t realize that she WAS working with a Russian group, right?
it’s easy to sympathize with her. I feel it too. I judge those people who followed those pages and, when fb shut down the pages run by troll farms, got indignant and pissed at fb for interfering with their right to be trolled. Obviously because fb is conspiring with the liberal media. Ugh.
Yes to the first question.
To be clear about the second point, you’re mad at FB for shutting down Russian troll pages? You’re angry that this is a conspiracy with liberal media? Or were you being sarcastic?
No. I’m judging people who are pissed about FB shutting down troll-farm-administered FB pages. I’m not mad, but I am making assessments on their intelligence and reasonableness.
I wasn’t being sarcastic with that line.
I was quoting them. And I assure you that “ugh” isn’t sarcastic but an understatement if anything. Disgusted is probably a better word.
OK, got it. I’m assuming they made a first amendment argument–that FB shouldn’t restrict or censor information. Or were they angry because they felt like FB’s shutdown was due to political bias?
It was both. I also think they genuinely received legitimate info from these pages, info they considered valuable. I guess they didn’t like having to go and get it from other places now.
Lies, Conspiracy Theories, and Hoaxes Poisoning Information Environment, Making Us All Sick
This is a good article–not necessarily about information/hyper warfare, but about the prevalence of bad information–namely, lies, conspiracy theories, and hoaxes–in our information space. This is a relevant issue because the problem indicates that we, as a society, are struggling to minimize these pollutants–and this creates a huge vulnerability in an information war. Minimizing these pollutants is aligned with helping us combat information warfare.
Edit
Good companion thread:
This raises an interesting point: If a propaganda site/troll also sometimes provides reliable and valuable information, then what? For me, though, I think the answer is pretty clear. Namely, if there is compelling evidence that a group/entity has hostile, nefarious goals, a) I immediately tune them out; b) I’d want platforms to remove them; c) I’d want some action taken to discourage/deter those types of activities.
The key discussion would involve standards relating to what constitutes a hostile goal, what constitutes compelling evidence, etc. We need to have that discussion.
Well FB is pretty clear in its terms of service that sock puppets aren’t allowed, so the content and intent don’t have to violate the terms in order to be withdrawn, as long as the user account running the page isn’t a true (real) user.
That seems like a good policy, but how do they know if someone is a sock puppet?
I’m not sure if this is accurate, but if many Trump supporters are complaining that many of their followers were removed for being Russian trolls/bots, that’s kinda funny, but maybe also a little disturbing.
It could be that Russian trolls and bots are pushing that hashtag:
What should determine which information gets the most attention in our society?
Zeynep Tufekci, a sociology professor at Chapel Hill, has talked about the way social media platforms are built around capturing our attention. When I talk about information filters, I’m thinking of tools, institutions, or processes that help individuals sort good information from bad–which, to my mind, also relates to helping people receive information in a way that is meaningful and coherent. But the factors that influence what we pay attention to–via the media–seem like something different. (I haven’t really thought about it deeply.) And it seems as if an individual or group could easily manipulate what a lot of people pay attention to. This seems like a disturbing reality (if true), especially when it comes to those with malicious intentions.
This raises a question: how should we determine what information receives the most attention at a given moment? Who should do this? These are questions I’d like to explore. Off the top of my head, I would say that we should find ways to prevent gaming of the process–e.g., using trolls and bots to divert attention from one thing to another. I’m also wary of having our attention dictated purely by popularity or by the individual–at least not for news; I tend to think professional journalists and experts should have more say in this, although these may not be mutually exclusive. That is, we may have both methods operating at the same time, although I suspect the general populace will tend to favor one over the other. To be clear, I’m talking about the larger processes that govern what gets the most attention on the internet–and even in traditional media. Anyway, that’s a few thoughts for now, and I hope to revisit these questions later.
The Goals of Authoritarian Regimes in an Information War Against Democracies
I believe the goals in an information war are very different from a war involving military or kinetic conflicts. Here’s a first attempt to jot down some ideas about the goals of the former.
1. Create a situation where the general populace believes that facts don’t really exist, don’t matter, or can’t really be known if they do exist.
2. Create a situation where the general public is confused about key facts or arguments;
3. Create a situation where the general public moves further away from logical, civil discourse, while increasing acceptance of conspiracy thinking and emotional decision-making;
4. Exacerbate existing divisions within the democratic society.
5. Weaken or destroy democratic institutions and governments or the public’s faith in them–especially those that the society depends on for good information and sound arguments.
6. Interfere with elections so the public doesn’t trust the results.
12/18/2018
Another:
To be continued…
If these are the goals for authoritarian regimes, how should democratic countries respond?
Attacking the Press Can Be a Type of Information Warfare
From McClatchy: Hoax attempts against Miami Herald augur brewing war over fake, real news
Someone created false stories/tweets, attributing them to The Miami Herald. The reasons are unknown, but these
And
In this case, some individual or group may have been trying to scare people. However, I can see how authoritarian regimes or enemies of a democracy would do something similar to undermine faith in a news outlet. The press is a key institution in a democracy. If an enemy can undermine faith in the press, that can really cripple the society. It’s like taking out the communication system of an enemy in a conventional military conflict.
How Corruption Can be Weaponized
From The Atlantic: The Dark Art of Foreign Influence Peddling by Thorsten Benner.
Mueller’s recent indictments–on Russian interference and of Paul Manafort–kick off the article, which describes more broadly how authoritarian regimes pay former politicians to lobby on their behalf, not only to help achieve policy goals, but also to improve their image, adding legitimacy to these regimes.
Authoritarian regimes can also give large sums of money to banks, non-profits, sports leagues, churches, universities, and political parties–and in return those groups can do things to help the image and agenda of those regimes. (I imagine that if they take money and become entangled with these authoritarian regimes, they run the risk of exposing themselves to blackmail.)
One solution:
Bottom line: Do not take money from authoritarian regimes or cutouts for authoritarian regimes. It is a deal with the devil. Those who do so should be looked at with some degree of suspicion and caution in my opinion.
Fighting Back
My sense is that the U.S. is struggling to figure out how to fight back against Russia and other authoritarian regimes, in and outside the information sphere. One idea came to mind (Oh, besides learning from countries near Russia, which are familiar with Russian information/hyper-warfare): Let’s turn inward and examine what we Americans value and hold dear–what ideals are most important to us, that we’re most proud of? What are the moments in our history that make us the most proud? Who are the Americans we most admire? What are the ideas and beliefs that bring us together? I feel like we need to reexamine these things, relearn them, and then come together and celebrate them. This is a good starting point. If Americans can tap back into these things and reaffirm them, we’ll be ready to take on authoritarian regimes, as well as radical religious groups like IS and al Qaeda.
Edit
In case it’s not clear how examining and reaffirming our values will help us fight information/hyper war, I want to explain myself:
1. One goal of Russian information warfare is to widen existing divisions in our country. By focusing on the things that we value, I believe we can be drawn together, making it harder to divide us.
2. As more Americans do this, I think it will expose the Americans who support ideas and positions that are incompatible with, if not antithetical to, the type of values and ideals most Americans embrace. Equality among all people, the idea that we should be judged by the content of our character and not the color of our skin, e pluribus unum–as Americans celebrate these, it will be harder for politicians to push ethno-nationalist policies and ideas; it will expose those who support those ideas and policies; and my hope is that they will not feel good about this, and will turn back to the American ideals that most Americans embrace.
3. Doing this can strengthen our leaders and remind and encourage them to hold up these values against authoritarians like Putin, Xi, Duterte, Erdogan, and Orban, as well as populist ethno-nationalists like Marine Le Pen. If done well, we can bring together and lead other liberal democracies. (It’s sad that our president is not only not doing this, but is doing something closer to the opposite.)
Russia Could Control Water and Power Plants, Including Nuclear
3/16/2018
I’ll put this here.
How to win the (hyper) war?
Thoughts on Dealing with Russia’s Use of “Plausible Deniability”
The passage below explains the way the use of plausible deniability can stymie efforts to respond to Russia:
Here’s my thought on how to counter this:
1. Create a report–it could be an article, TV program, or film documentary–that cites incidents where this has occurred, including evidence to back this up. The report should establish a clear pattern of behavior. (Also, Hollywood should consider featuring villains that use the same tactic.) Show this report to the world.
2. After heavy publicity, one response might be to explain to Russia that other democracies cannot trust Russia, and if Russia values healthy relations with other nations, they need to take a series of steps that prove their good faith. (I’m not sure what those would be right now.)
3. If they fail to take these steps, let them know that Russia is in bad standing with the rest of the democratic countries, and that there will be a series of consequences for that. Part of being in bad standing means that other countries will not give Russia the benefit of the doubt; the assumption will be that Russia is acting in bad faith.
4. If they continue to engage in hyper-warfare, then they can expect escalation of conflicts.
The logic here is simple. When you meet an individual or organization that behaves deviously and in bad faith–and there’s a pervasive pattern of that–then the reasonable response is social censure and other negative consequences. The guilty party will not be trusted; people will assume they are acting in bad faith, even when they may not be. The guilty party has a huge mountain to climb in terms of winning back trust and good faith. I believe Russia is in that position now. The democratic countries can offer a way back to good standing, but until then Russia will be seen as a bad actor, and if it continues, serious consequences could follow, including a military response.
Having said all that, there’s something else that should be considered, something a bit out in left field. A bigger problem with Russia may be finding a narrative and identity that gives the Russians a sense of pride–while also not involving a desire for imperial power. If Russians can find a narrative that allows them to behave in good faith, while also making them feel good and proud about themselves, particularly in relation to the world, then I think that’s ultimately going to solve the problem.
(This relates to the same problem with white Americans who are losing cultural power, and with Muslims in the Middle East who feel humiliated by Christian and Jewish nations. All these groups have, or have had, power and pride rooted in their culture/civilization. When that culture is diminished or diminishing, it’s going to stir up great resentment, fear, and anger. I’m thinking that if we can find a way for them to cope with these changes and find a new narrative that will not only help them accept the situation but also make them feel good about their culture, we can get to the heart of the problem.)
How Can Liberal Democracies Fight an Information and Hyper War?
One big challenge is that information warfare is shadowy, almost ghost-like–not just in terms of the surreptitious nature of the fighting, but also the ephemeral impact of such warfare. The public doesn’t really feel the impact in the same way as a military or terrorist attack. An information attack is just based on words and ideas, and it’s hard to see how those can really harm a society. To confuse matters, open societies, by definition, are open to all kinds of words and ideas, including ones that can be distasteful. Indeed, many Americans pride themselves on living in a society like this.
These things make defending against an information war very difficult–I have difficulty getting a strong grasp on the problem and articulating it….
…Some thoughts of how liberal democracies will have to fight information and hyper war:
1. Clearly articulate the nature of the threat–in a way that is palpable and compelling to citizens.
2. Create a public campaign against authoritarians and others bad actors who are attacking liberal democracies via information and hyper warfare.
(I guess these are obvious?)
More later…
Edit
With regard to #2, part of the public campaign should involve a huge history lesson–specifically about despots and dictators, the way the Founding Fathers tried to create a system that would protect against such individuals, and why that’s so important. The campaign should remind everyone of the American leaders who have conformed their behavior to the spirit behind this system, in addition to submitting to the rule of law and checks on power. As I mentioned above, part of this campaign should involve public gatherings that celebrate these things. I think this is an important part of confronting the threat….It’s important to remember that this isn’t just information/hyper warfare, but information/hyper warfare conducted by authoritarian regimes–the goal being to undermine democracy, undermine faith in democratic institutions, and cause people to believe that liberal democracies are really no different from authoritarian regimes and that there’s no way of knowing the truth….This is what we have to battle.
3. Strengthen the free press–by giving it sustainable funds and independence. To my mind, in the information war, the press becomes the information warrior for liberal democracies. If authoritarian regimes are trying to poison information environments and undermine trust in democratic institutions, a robust free press serves as the custodian and guardian of those institutions. (I should note that being a guardian doesn’t mean the press will defend institutions when they behave inappropriately, are corrupt, etc. The press guards the legitimacy of the institutions, which includes exposing wrongdoing that occurs within them.)
4. Campaign against corruption. Part of authoritarian hyper warfare seems to involve a variety of ways of infecting liberal democracies with corruption. Reducing corruption seems like a key part of fighting authoritarian hyper warfare. This should include individuals and institutions in liberal democracies not taking money from sources that have strong ties to authoritarian regimes.
With the possible exception of #2 (and probably not even that), most of these seem defensive. Is there any offensive measure that could be taken? Some ideas:
Cut off assets and money for oligarchs of authoritarian regimes.
Be more aggressive in promoting democracies, free elections, building democratic institutions (e.g., support of free press in other countries, etc.)
One idea, although I don’t know how it would be implemented: In a workplace or a smaller community, people who behave deviously soon become social pariahs, and I tend to think that can be a strong deterrent. Can we do something similar to regimes that behave this way?
5. Regulate internet and social media platforms. If the government doesn’t do this, the platforms themselves will have to change. Or can there be non-government institutions that somehow play a role in regulating the information space online?
Seizing Authoritarians’ Money and Banking Reform to Combat Corruption
From WaPo
(emphasis added)
There is No Such Thing as Cyber Deterrence
From The Cipher Brief.
How is Cyberspace different from conventional battlefields?
I got lost with the following passage, though:
Crazy Conspiracy Theory Alert
I’m saying this not to mock the following tweets, but to serve as a caveat. Also, I want to signal that I’m not really comfortable with speculating about this, as it sounds like a crazy conspiracy theory. I’m still uncomfortable, but I don’t think this idea can be completely dismissed.
How Does the Press Filter Information Operations from Adversaries?
The following tweet, and some of the comments, made me think of this issue:
If foreign countries or malicious actors want to use information to divide the nation, undermine important institutions or political leaders, or just muddy the information space to cause confusion or strategically manipulate attention, in ways that hurt the U.S. and/or help its adversaries, I think the press has to be mindful of these objectives–and has to factor them into the way it filters information and decides how to write about a given piece of information.
This is a complicated matter, and I want to lay out the different types of scenarios where an adversary strategically uses information for specific objectives:
To exacerbate existing divisions. Suppose a foreign adversary obtains sensational information and releases it to inflame certain factions in the U.S. The information may be true. And suppose the foreign adversary gives this information to the press in a way that is strategically advantageous. Suppose the information adds fuel to an existing fire.
Should the press publish the information?
More later.
Another potential example.
Thought: The press can’t think of itself as neutral in an information space where individuals, groups, and nations are weaponizing information. The press has to understand the nature of information warfare, and then come up with rules and a strategy for addressing it. Another way of saying this: They have to re-think their role; because of the changes that have occurred, their role should also change to suit those changes.
Here are other thoughts on why the press isn’t or can’t be neutral: Is the press neutral about the importance of facts, meaning, context, and reason? Is it neutral or agnostic about what constitutes good political discourse and a healthy public square? Is it neutral about political systems–e.g., liberal democracy vs. autocracy? I’m pretty sure the press isn’t agnostic about these things, and if not, then it will almost certainly have to re-think its role–it’s not going to be agnostic toward bad-faith actors who intentionally use information, including in dishonest ways, to undermine democracy and liberal democratic institutions and countries.
I’m not clear about how they should go about doing this, but I’m fairly certain they have to re-think their roles and come up with a new approach.
U.S. Kicks Out Russian Spies Suspected of Tracking Russian Defectors in the U.S.
Poisoning in the UK of a former Russian spy and his daughter was probably a big reason for this.
Thread
11/29/2018
Another thread by Starbird.
Not Just Russia
I didn’t get to this yet, but want to later.
I didn’t read the 202, but I read the article reporting on the report. I need to get an understanding of how the analysts determined that the Russian effort was “effective” and how it became more so. That’s the part I’m most unsure about in this whole affair.
I don’t know what happened to your comment, Reid. Most of it is quoted here in my response, but somehow the comment I quoted has disappeared.
Heck no. It’s a hostile action, and if the current executive administration was involved, it’s totally a crime against this country.
Yes, I’ve heard similar stories (admittedly from episodes of MASH) of North Korean leaflet drops. But I’m told the US has spread similarly comical propaganda in other parts of the world (no confirmation). Do we know how substantial the broadcasts of Radio Free America are?
That is really good to hear, and I say that because I feel like I’m in a small minority. I don’t know if you feel the same, but these Russian attempts piss me off, and I see them as an act of (information) war.
I’m not sure what you mean by “substantial.” Are you referring to the level of effort? The level of effectiveness?
By the way, I see propaganda–the promoting of certain views–as not necessarily the same as active measures and disinformation. I would not be in favor of the U.S. and other Western allies using disinformation and active measures on our adversaries.
Important question and caveat:
Back and forth between Silver and Zeynep Tufekci:
(The debate kinda reminds me of my football arguments with analytics proponents.)
https://www.wired.com/story/2018-was-a-rough-year-for-truth-online/?mbid=social_twitter&utm_brand=wired&utm_campaign=wired&utm_medium=social&utm_social-type=owned&utm_source=twitter
(I didn’t read this yet, but hopefully I will soon.)
The article mentions four tactics–deep fake videos, AI, super doxxing, and agitating disaffected groups in a society.
To combat deep fake videos and other fake news, I thought of organizing discussion groups, led by journalists and made up of ideologically diverse members. The journalists would have to work hard to find good information, but if they distributed that information within these groups, the members would be able to rely on it a lot more. It’s harder for someone from another country to distort the information in this context.
However, I also realize that maybe older technology like TV might become more valuable. Instead of getting information from the internet, people might turn back to radio and TV–two media that are harder for information warriors to penetrate.
5/23/2019
Now, Trump is re-tweeting the fake video:
Dang it! This is so wrong!
5/24/2019
Facebook is leaving up the doctored video(s) of Speaker Pelosi. They claim to be sending warnings to people sharing the video that it is fake, but they will not take it down. Bickert explained that FB believes that when misinformation/disinformation enters political discourse, instead of taking it down, they will leave it up, flagging it in some way to indicate that it is fake. Does this sound like a better move than taking it down? It doesn’t seem that way to me. It’s easy to imagine that many people won’t realize the video is fake.
Additionally, I would think this is only going to invite actors like Russia to post doctored videos to hurt candidates, institutions, or who knows what else. It doesn’t seem like a good decision by FB.
I didn’t read the article, but my reaction to the tweet–particularly the idea that Russia doesn’t allow their soldiers on social media: the Russians get it, and we don’t. Now, maybe the U.S. military can’t prohibit their personnel from going on social media, but are they at least providing extensive training to help their personnel? I don’t know what they’re doing, but the impression I get is that we don’t fully realize the threat that information warfare, turbocharged by social media and the internet, poses to our national security.
(Haven’t read this yet, but hopefully will do so later.)
At one point, this is the kind of thing where I thought: “No one would go this far–a sense of ethics or existing norms would prevent someone or some organization from doing this.” Maybe American businesses wouldn’t, but I now believe nations and other groups will do, and are doing, things like this to manipulate the information environment. I’m not sure how you guys feel, but I don’t think this is good for the internet, social media, online interactions, or political discourse.
A big reason I tend to accept these claims is that I find the threat very credible. Weaponizing corruption, organized crime, finance, to me, means a state taking these aspects and infecting other countries with them to gain influence and/or weaken those societies, especially liberal democratic ones. Disinformation is a similar type of disease, but one that poisons a democratic society’s public square. I think all of these things can be seen as a form of disease that can threaten a liberal democracy.
By the way, Russia isn’t the only threat. I would broaden this to any authoritarian country or even individuals wishing harm. Additionally, states or individuals may not be using these tools strategically–it may just be emanating out of their country.
In any event, I think this is a big deal. And I think the way to address this is to bolster Enlightenment values, the free press, and initiate anti-corruption reforms. What I think of as progressive reforms dovetails with protecting our national security.
5/14/2019
5/15/2019
(Haven’t read the article below yet)
5/17/2019
This would be really bad if similar things are going on in the U.S. and other Western countries:
Not really about information warfare, but about Russia more generally.
The thread these two former CIA guys are referring to:
This is not just a Russia issue.
Ugh. I really think we need a discussion about conspiracy theories–how we should be wary of them
Important thread
The takeaway: the Kremlin will manipulate Democrats and Republicans–or any group or individual in America–to divide the nation and cause chaos. Let’s say Russia has Trump’s tax forms, and they get them to a cutout, and that person approaches a Democrat. The Democrat–and Trump opponents–should be incredibly wary and hesitant. Actually, off the top of my head, I think the Democrat should report this to the FBI right away and not take the information. If the Kremlin is doing something, they’re doing it because it’s in their interests, and based on my understanding, they see dividing the U.S., weakening faith in democratic institutions, and exacerbating divisions as ways to achieve this. Opposition to and even hatred of Trump and Republicans shouldn’t cause people to overlook this. Russia and any adversary employing the same tactics should be seen as a bigger threat and enemy than the other political party.
To me, these are hostile acts (from WaPo):
I hope Americans get as mad as I do whenever they think about this.
If the two parties were healthy, they would be uniting over protecting the integrity of the elections. They would push back hard against a POTUS making baseless claims that mail-in voting is unreliable.
A thread and a post to comment on later:
Thread by Thomas Rid
and
One more:
One more:
I didn’t read this yet, but what the heck are they up to with this?
Based on my (limited) understanding of the concept of hyper-warfare, I consider the recent ransomware attacks in the U.S. a form of hyper-warfare. Hyper-warfare doesn’t consist of blatant acts of war. It consists of actions that exist in a gray area; the average citizen may not view them as acts of war.
Features that allow the nation-state to evade blame seem to be another aspect of hyper-warfare. In the ransomware attacks, my understanding is that Russian cybercriminals are largely behind the attacks. However, the Russian government does have the capability of stopping the activity of those groups. If Russia views the U.S. as its main adversary, then it benefits from allowing the cybercriminals to attack the U.S.
To me, it seems clear these attacks are essentially acts of war–and the Russian government is responsible, at least indirectly.
Here’s Clint Watts talking about the attacks:
60 Minutes also had a recent feature about the ransomware attacks.
My takeaway from this: We’re getting our butts kicked. It really undermines the idea that the U.S. has some of the best cyber capabilities in the world.
We are getting our butts kicked, part 500
There’s a clip in here with President Biden saying he told Putin that we have great cyber capability, or something to that effect. I think the President said that he warned Putin of serious consequences. I call BS on both. Maybe the U.S. government is doing something behind the scenes–fine. But the bottom line is that whatever they’re doing, it’s not stopping the Russians. The threats from the U.S. seem extremely hollow to me.