Page 149 - Flaunt 171 - Summer of Our Discontent - Lili

LET’S START WITH KEEPERS OF THE CITY, SHALL WE?
Written by Audra McClain
Before you go anywhere, it might be wise to understand what anywhere means in the contemporary context. Locate yourself. We’re about to break it down. “Anywhere,” by Oxford’s definition, means “in or to any place,” but by the U.S.A.’s definition it means subject to police protection and oversight, right? The Fourth Amendment assures our right to privacy, but how much privacy do we really have? How much will we have in ten years’ time? The answer is in the air, and it goes hand in hand with the question of whether police departments should be allowed to use facial recognition technology.
America is split on the issue. In places like San Francisco and Boston, the technology has been banned for law enforcement purposes, at least for the time being. Tech companies like Amazon and Microsoft are refusing to sell their facial recognition technology to police departments, in hopes that federal regulations are soon put in place. Brad Smith, president of Microsoft, said of his company’s refusal, “We will not sell facial-recognition technology to police departments in the United States until we have a national law in place, grounded in human rights, that will govern this technology.”
But just because these well-known companies aren’t selling their tech doesn’t mean other companies aren’t jumping on the opportunity to do so. States like Florida and Michigan are still using the technology, despite the controversies surrounding it.
A report by the National Institute of Standards and Technology (NIST) showed that under ideal conditions the best algorithms can be accurate over 99% of the time. That number decreases as other factors, such as lighting and positioning, come into play. Even under pristine conditions, the accuracy rate is only close to perfect, with an error rate of about 0.3%.
How about our pandemic coat of arms? When an individual is wearing a face mask, the best tech fails around 5% of the time, and other algorithms can fail up to half the time. These error percentages may sound small, but when the technology has the potential to be used on millions of people... suddenly those numbers don’t seem so tiny.
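The scale problem above can be made concrete with a little arithmetic. The error rates below are the NIST figures cited in this piece; the search volumes are illustrative assumptions, not figures from the article. A minimal sketch:

```python
# Back-of-the-envelope: how small error rates add up at scale.
# Error rates are from the NIST figures cited above (0.3% best-case,
# ~5% masked, up to 50% for weaker algorithms on masked faces).
# The search volumes are hypothetical, chosen only for illustration.

def expected_errors(error_rate: float, searches: int) -> int:
    """Expected number of misidentifications for a given error rate."""
    return round(error_rate * searches)

scenarios = [
    (0.003, "best algorithm, ideal conditions"),
    (0.05,  "best algorithm, masked faces"),
    (0.50,  "weaker algorithms, masked faces"),
]

for rate, label in scenarios:
    for searches in (1_000_000, 10_000_000):
        print(f"{label}: {expected_errors(rate, searches):>9,} "
              f"errors per {searches:,} searches")
```

Even the best-case 0.3% rate yields thousands of mistakes per million searches, which is the point the paragraph is driving at.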
Just this summer, facial recognition technology wrongfully identified a Black Michigan man as the suspect in a 2018 robbery of a retail store in Detroit. After procuring a warrant, police arrested Robert Williams in front of his distraught family. He was interrogated and detained for thirty hours, only for the authorities to later realize their suspect was not their man, falsely identified by his driver’s license photo, presumably one of thousands combed through by an algorithm. Facial recognition technology failed Robert Williams.
In 2016, America had over two million people in its prison system. An estimated 20,000 of them, roughly one percent, have been wrongfully convicted. How many more innocent people could these irresponsible technological mistakes put behind bars?
Not only does the technology sometimes fail despite perfect conditions, it is also more flawed when used to identify Black people, and Black women most of all. In a NIST study, the tech misidentified Black women at up to ten times the rate of white women. This is shocking, though perhaps not surprising, given the conversations being waged at present concerning systemic inequality. Even our tech is disenfranchising.
On May 25th, 2020, George Floyd was murdered in broad daylight by four Minneapolis police officers. His death, and the deaths of many other innocent Black Americans, pushed people across the country to exit their homes and take to the streets to protest. Protest against police brutality in America. Protest against racial profiling. Protest against the wrongful treatment of people of color. Protest against the system in which these issues thrive.
Still unraveling months later are the events in Portland, where these protests continue to flourish. As in many big cities, conflict has emerged between police and protesters, but unlike many cities, Portland faces a new issue: deployed federal agents. The agents were ordered in by President Donald Trump to ensure no further violence erupts and results in property damage. Acting as guards, they have used tear gas, rubber bullets, and batons to “tame” the protesters. Additionally, there have been claims of unidentifiable agents placing protesters in unmarked vans.
But was Portland ever really in such a violent state that this needed to happen? Or is this false information meant to deceive the public? Disinformation planted by officials to paint the protesters as instigators of violence, as the creators of mass chaos? Disinformation meant to discredit all the change these protests have already brought about and will continue to bring about? Disinformation to make it seem as though the president’s order is meant to keep America safe and “great again”? Disinformation to boost Trump’s likeability?
According to Portland Mayor Ted Wheeler, the deployment of the agents has caused more tension than before. “Their presence here is actually leading to more violence and more vandalism,” Wheeler told CNN, “And it’s not helping the situation at all. They’re not wanted here.”
Alas, disinformation campaigns are not a new concept in this country. Time and time again throughout history, superpower nations have purposely misinformed their citizens for political and socio-economic control. The Soviet Union’s KGB was notorious for it, and during the Cold War the US matched, if not exceeded, these tactics: fake organizations, forgeries, rigged public stunts, arrests, deaths.
A new book by Thomas Rid, a political scientist best known for his work on the history and risks of information technology in conflict, Active Measures: The Secret History of Disinformation and Political Warfare, published this April by Farrar, Straus and Giroux, brilliantly categorizes the roughly one-hundred-and-twenty-year history of so-called Active Measures, what intelligence officers used to call