“They don’t understand how it works.” Information Technology and the Queasy Underbelly of Democracy

Politicians low on the tech learning curve

Alexander Nix, CEO of Cambridge Analytica and chief architect of the Trump-assisting “defeat crooked Hillary” campaign, said of his testimony before the (U.S.) House Intelligence Committee: “They’re politicians, they’re not technical. They don’t understand how it works.”

The exploits of Cambridge Analytica in suppressing votes and unleashing torrents of misinformation and flat-out falsities upon the data rivers of social media got (as usual, excellent) coverage by The Guardian in a piece dated March 21, 2018, Cambridge Analytica’s Assault on Decency, which has more on Nix, the Facebook data breaches, and the “crooked Hillary” campaign.

This echoes a theme emerging from previous U.S. Congressional hearings dealing with social media: politicians are way out of their depth in advanced information technology. As Nix says, they simply do not understand how it works.

And that is a very, very big deal. It’s big because these legislators make decisions on which matters of great consequence ultimately depend: the security of elections; who gets prioritized in our health care system; how money moves; whether hackers can damage our electrical grid and communications infrastructure, undermine our financial institutions, or even gain control of nuclear weapons. And so forth: pick your own menace.

(BTW, if you value The Guardian’s coverage, as you should, please make a contribution; they still, amazingly, do not have a paywall.)

Content understood, tech not so much

The electoral antics of Cambridge Analytica were immoral and threatening to our political system. Now that the free press has partially pulled away the veils concealing Cambridge Analytica, our politicians will undoubtedly give Facebook and the data thieves a hammering. At the very least, the kind of anti-Clinton, voter-suppressing messages that proliferated in 2016 can be interpreted shrewdly by politicians, who are keen psychologists if nothing else. Understanding the effects of certain types of messages on susceptible minds is important. Legislators can understand that, and they can devise ways to constrain and penalize companies, groups, and individuals in familiar old legislative ways.

But just as important as content is how these messages get propagated. No matter how many questions senators and congressmen ask (and, to their credit, they do ask), without a deep understanding of the technology they will have great difficulty dealing with anti-fact propaganda that will inevitably slip past safety measures. Not to mention direct tampering with any of the many systems (e.g., financial institutions, power plants) connected to the internet.

The problem is much broader than electoral politics. Big Data algorithms are now at the core of decision making in finance, health care, criminal justice, education, hiring and firing, you name it.

These matters are all discussed in a most excellent, short, and highly readable book, Weapons of Math Destruction, by Cathy O’Neil. O’Neil was once upon a time a “quant” (math and data science expert) at a Wall Street hedge fund who eventually bailed, joined “the 99 percent,” and is now engaged in a fight against the “Weapons” as she terms them.

Want to learn how using Big Data biases prison sentences, bank loans, teacher promotions and demotions, health benefits, etc.?  Check out Cathy O’Neil’s book.

The opacity problem: multilayered incomprehension

One of the most salient themes of O’Neil’s book is transparency versus opacity.* Sure, she talks about intentional discrimination in many forms, such as rating mortgage risks by zip code. But the larger issue is that even when algorithms are written with the best of intentions, how they implement those intentions is often inscrutable, even to those who write the programs. That’s because (1) the algorithms and their interactions are extremely complex, and (2) the datasets used by the algorithms are often skewed and almost always incomplete. It gets even more complex and more incomprehensible when you introduce machine learning, which is adding a whole new dimension to computing capability.

These processes are, O’Neil maintains, to a great extent, opaque.
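To make that point concrete, here is a minimal, hypothetical sketch rather than anything from O’Neil’s book: a small Python example (assuming NumPy and scikit-learn, with invented data, invented feature names, and invented numbers) in which a lending model never sees a protected attribute, yet its approvals still split along group lines because a benign-looking “zip group” feature is correlated with historical disadvantage.

```python
# Hypothetical illustration only: data, feature names, and numbers are invented.
# It shows how a "neutral" model can encode bias through a proxy feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Invented training data: "zip_group" stands in for a neighborhood indicator
# that happens to correlate with historical disadvantage; income drives
# the "true" repayment outcome in this toy world.
zip_group = rng.integers(0, 2, n)                # 0 or 1
income = rng.normal(50 - 10 * zip_group, 8, n)   # group 1 is poorer on average
repaid = (income + rng.normal(0, 8, n)) > 45     # historical "ground truth"

X = np.column_stack([income, zip_group])         # no protected attribute anywhere
model = LogisticRegression().fit(X, repaid)

# Approval rates diverge by group, and nothing in the fitted coefficients
# announces that zip_group is acting as a proxy for anything sensitive.
for g in (0, 1):
    rate = model.predict(X[zip_group == g]).mean()
    print(f"zip group {g}: predicted approval rate {rate:.2f}")
```

On this invented data the two groups end up with noticeably different approval rates, even though the modeler only ever asked for accurate repayment predictions; that gap, and the reasons behind it, are exactly the kind of thing a legislator questioning the model’s owner would struggle to see.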

Thus we have (at least) two layers of misunderstanding—decision-makers don’t get the tech, and maybe worse, the technical geniuses don’t even have full understanding of what goes on inside the machines.  And the opacity problem will continue to worsen as interdependent algorithms get ever more complex, datasets grow ever bigger, and machines keep learning faster and faster.

Recently, I have come across two reports of computer programs solving problems in ways that no human had thought of before and, per the data scientists who analyzed the solutions, in ways that a human would never think of.** Smart-enough computers may be able to think outside of whatever boxes we construct.

And that, to me, is chilling.  We don’t even have to see machines reach the scary level of superintelligence*** to worry about how they  can fundamentally change our lives in ways that we don’t understand.  O’Neil points out that they already do, in many cases, and how they do it is often invisible. Alarmingly, decision-makers don’t understand the processes any better than most of the rest of us. Legislators are trying to figure out how to penalize and constrain companies and individuals. Meanwhile, out of sight, Big Data machines are gobbling and digesting data at terabytes per microsecond, and spitting out decisions whose bases are often inscrutable, with impacts at every level of contemporary life.

 

====================== footnotes follow ===================

* There’s a lot more to O’Neil’s book than the opacity/transparency issue, but I picked it as most relevant to the discussion at hand.

** Sorry, I can’t cite the instances; they appeared in either New Scientist or Scientific American, and I don’t have time to chase them down.

*** I discussed the dangers of Superintelligence (one of the most important but least urgent issues of our day) in a sort-of review of Nick Bostrom’s book by that name, in a post to be found at Superintelligence warily considered.
