J**I
A catchy title…
…and a book that delivers, delving into the numerous ways that mathematical algorithms impact our lives, sometimes very negatively and often without appeal. All too often, the people who formulate the algorithms view the injustices as just so much collateral damage, sacrificed on the altar of economic efficiency. At the end of this review, I'll provide an example of my own.

Cathy O'Neil has a PhD in math from Harvard, taught at Barnard, and then decided to make three times the money by working as a "quant" on Wall Street, specifically for the hedge fund D. E. Shaw. Among the numerous wry observations she makes in the book, she compares working at D. E. Shaw to the structure of Al Qaeda: information was tightly controlled in individual "cells," and no one (probably not even the big boys) understood the entire structure, which prevented anyone from "walking" to a rival. The financial meltdown of 2008, when the quants, and others, suddenly realized that a strawberry picker named Alberto Ramirez, making $14,000 a year, really couldn't afford the $720,000 he financed in Rancho Grande, CA, and that the "Triple A" rating on the bonds issued against such mortgages was therefore phony, proved to be her "Saul on the road to Damascus" moment, which eventually led to this book. (She doesn't make the point that the damage the quants did to so many Americans, in lost homes and jobs, was far, far greater than Al Qaeda's wildest aspirations.)

In her book, O'Neil goes far beyond Wall Street to other segments of our society: colleges, the judicial system, insurance, advertising, employment, teacher evaluations, credit scores, political campaigns, and Facebook.

Consider colleges. It was US News and World Report that dreamed up the idea of ranking colleges based on "objective" quantitative criteria. They convinced others to play along, in particular the colleges themselves.
And so, from the perspective of a university president, "…they were at the summit of their careers dedicating enormous energy toward boosting performance in fifteen areas defined by a group of journalists at a second-tier news magazine." A most important area was totally omitted: "value for money," a standard criterion for most Amazon Vine reviews. And so, as she says, in meeting these journalists' criteria, the cost of higher education rose 500% between 1985 and 2013. She cites a couple of examples of how colleges "gamed" the system. The most interesting was King Abdulaziz University in Saudi Arabia. Its math department had been around for TWO years when, in 2014, it came in 7th place in the world, behind Harvard but ahead of MIT and Cambridge! How? It searched the professional journals for the professors with the most citations, one of the criteria in the algorithm, and offered those professors $72,000 a year for three weeks of work as "adjunct faculty." Voila.

On public school teacher evaluations in the USA, O'Neil cites the example of a well-respected teacher who was fired for being in the bottom 10% of teacher evaluations. How? Apparently the teachers from the PREVIOUS year had falsified their students' standardized test results. The following year, when the well-respected teacher did not, it appeared to the algorithm that the students had declined. No appeal, no common sense. She was fired.

Insurance is a personal bugaboo of mine. O'Neil confirmed what I learned the hard way: a MAJOR factor in determining the price of insurance is an algorithm that identifies which customers are unlikely to switch insurance companies, and those customers are charged the most! When I finally figured this out, the hard way, a few years back, the company that famously proclaims you can "save 15% or more" was actually willing to drop my insurance premium 30% because I was leaving, which I still did, for another company that offered the same coverage for 50% less.
(I'll be changing from that company in a couple of years, of course. What a racket.)

Another fascinating section covers how our online behavior is monitored, which changes not only the ads we see but the very news we read. And how much effort is expended in political campaigns on those few undecided voters in Florida and Ohio. Wow. It truly calls for the abolition of the Electoral College.

Finally, my own example. I once worked for the COO of the most famous hospital in the aforementioned Saudi Arabia. He called me in one day and asked if I could do standard deviations. Thanks to Bill Gates et al., I assured him I could readily do them. "Then please do them on all the doctors' salaries, per department." Again, thanks to Bill, it was done in a day. Why, oh why? It was the COO's own "algorithm." When he met monthly with each department chair to discuss physician evaluations and salary increases, there would be nothing "personal" involved. He could point to this objective report and express his concerns about the "standard deviation" of the salaries within the department. And depending on, ahem, the circumstances, he could say the standard deviation was "too high" (or, of course, "too low"). That was the "basis" for giving out a 2½% or 5% salary increase. "Clever."

As for O'Neil's book: 5 stars, plus.
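For what it's worth, the COO's "objective report" is just a per-group mean and standard deviation, which takes only a few lines to produce. A minimal sketch in Python (the department names and salary figures here are invented for illustration, not taken from the book or the hospital):

```python
from statistics import mean, pstdev

# Hypothetical data: department -> annual physician salaries (USD)
salaries = {
    "Cardiology": [310_000, 295_000, 340_000, 405_000],
    "Pediatrics": [180_000, 175_000, 210_000],
}

# One line per department: the mean salary and the (population) standard
# deviation, i.e. how widely salaries spread around that mean.
for dept, vals in salaries.items():
    print(f"{dept}: mean={mean(vals):,.0f}, stdev={pstdev(vals):,.0f}")
```

Whether a given spread is "too high" or "too low" is, of course, exactly the kind of opinion dressed up as math that this review describes.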
D**D
Very clear, but over-reliant on government solutions instead of more choices for consumers (competition!)
I was excited to read this book as soon as I heard Cathy O'Neil, the author, interviewed on EconTalk.

O'Neil's hypothesis is that algorithms and machine learning can be useful, but they can also be destructive if they are (1) opaque, (2) scalable and (3) damaging. Put differently, an algorithm that determines whether you should be hired or fired, given a loan, or able to retire on your savings is a WMD if it is opaque to users, "beneficiaries" and the public; has an impact on a large group of people at once; and "makes decisions" with large social, financial or legal impacts. WMDs can leave thousands in jail or bankrupt pensions, often without warning or remorse.

As examples of non-WMDs, consider bitcoin/blockchain (the code and transactions are published), algorithms developed by a teacher (small scale), and Amazon's "recommended" lists, which are not damaging (because customers can decide to buy or not). As an example of a WMD (many more are explained in the book), consider Facebook's "newsfeed" algorithm, which is opaque (based on their internal advertising model), scaled (1.9 billion disenfranchised zombies) and damaging (echo chamber, anyone?).

I took numerous notes while reading this book, which I think everyone interested in the rising power of "big data" (or big brother) or bureaucratic processes should read, but I will only highlight a few:

* Models are imperfect -- and dangerous if they are given too much "authority" (as I've said).
* Good systems use feedback to improve in transparent ways (they are anti-WMDs).
* WMDs punish the poor, because the rich can afford "custom" systems that are additionally mediated by professionals (lawyers, accountants, teachers).
* Models are more dangerous the more removed their data are from the topic of interest, e.g., models of "teacher effectiveness" based on "student grades" (or, worse, alumni salaries).
* "Models are opinions embedded in mathematics" (what I said), which means that those weak in math will suffer more. That matters when "American adults... are literally the worst [at solving digital problems] in the developed world."
* It is easy for a "neutral" variable (e.g., postal code) to reproduce a biased variable (e.g., race).
* Wall Street is excellent at scaling up a bad idea, leading to huge financial losses (and taxpayer bailouts). It was not an accident that Wall Street "messed up." They knew that profits were private but losses social.
* Many for-profit colleges use online advertisements to attract (and rip off) the most vulnerable -- leaving them in debt and/or taxpayers with the bill. Sad.
* A good program (for education or crime prevention) also relies on qualitative factors that are hard to code into algorithms. Ignore those and you're likely to get a biased WMD. I just saw a documentary on urbanism that asked "what do the poor want -- hot water or a bathtub?" They wanted a bathtub, because they had never had one and could not afford to heat water. #checkyourbias
* At some points in this book, I disagreed with O'Neil's preference for justice over efficiency. She does not want to allow employers to look at job applicants' credit histories because "hardworking people might lose jobs." Yes, that's true, but I can see why employers are willing to lose a few good people to avoid a lot of bad people, especially if they have plenty of remaining (good-credit) applicants. Should this happen at the government level? Perhaps not, but I don't see why a hotel chain cannot do this: the scale is too small to be a WMD.
* I did, OTOH, notice that peer-to-peer lending might be biased against lenders like me (I use Lending Club, which sucks) who rely on their "public credit models": it seems that these models are badly calibrated, leaving retail suckers like me to lose money while institutional investors are given preferential access.
* O'Neil's worries about injustice go a little too far in her counterexamples, such as the "safe driver who needs to drive through a dangerous neighborhood at 2am" who does not deserve to face higher insurance prices. I agree that this person may deserve a break, but the solution to this "unfair pricing" is not a ban on such price discrimination but an increase in competition, which has a way of separating safe and unsafe drivers (economists call it a "separating equilibrium"). Her fear of injustice makes me think she's perhaps missing the point. High driving-insurance rates are not a blow against human rights, even if they capture an imperfect measure of risk, because driving itself is not a human right. Yes, I know it's tough to live without a car in many parts of the US, but people suffering in those circumstances need to think bigger, about maybe moving to a better place.
* Worried about bias in advertisements? Just ban all of them.
* O'Neil occasionally makes some false claims, e.g., that US employers offered health insurance as a perk to attract scarce workers during WWII. That was mainly because of a government-ordered wage freeze that incentivized firms to offer "more money" via perks. In any case, it would be good to look at how other countries run their health systems (I love the Dutch system) before blaming all US failures on WMDs.
* I'm sympathetic about the lies and distortions that Facebook and other social media spread (with the help of WMDs), but I've gotta give Trump credit for blowing up all the careful attempts to corral, control and manipulate what people see or think (or maybe he just had a better way to manipulate). Trump has shown that people are willing to ignore facts to the point where it might take a real WMD blowing up in their neighborhood to take them off autopilot.
* When it comes to political manipulation, I worry less about WMDs than about the total lack of competition due to gerrymandering. In the 2016 election, 97 percent of representatives were re-elected to the House.
* Yes, I agree that humans are better at finding and using nuances, but those will be overshadowed as long as there's a profit (or election) to win.

Can we push back on these problems? Yes, if we realize how our phones are tracking us, how GPA is not your career, or how "the old boys' network" actually produced a useful mix of perspectives.

* Businesses will be especially quick to temper their enthusiasm when they notice that WMDs are not nearly so clever. What worries me more are politicians or bureaucrats who believe a salesman pitching a WMD that will save them time but harm citizens. That's how we got dumb do-not-fly lists and other assorted government failures.
* Although I do not put as much faith in "government regulation" as a solution to this problem as I put in competition, I agree with O'Neil that consumers should own their data and that companies should only get access to it on an opt-in model. But that model will be broken for as long as the EULA requires that you give up lots of data in exchange for access to the "free" platform. Yes, Facebook is handy, but do you want Facebook listening to your phone all the time?

Bottom Line: I give this book FOUR STARS for its well-written, enlightening exposé of WMDs. I would have preferred less emphasis on bureaucratic solutions and more on market, competition, and property-rights solutions.
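The point above about a "neutral" variable reproducing a biased one can be made concrete with a toy sketch. In this entirely made-up example, the decision rule never reads the protected attribute, yet the postal code carries exactly the same information (all codes, groups and outcomes are invented for illustration):

```python
# Synthetic records from a hypothetical, highly segregated city: the
# rule below never reads "group", but "zip" acts as a perfect proxy.
records = [
    {"zip": "10001", "group": "A"},
    {"zip": "10001", "group": "A"},
    {"zip": "10002", "group": "B"},
    {"zip": "10002", "group": "B"},
]

def approve(zip_code: str) -> bool:
    """A 'group-blind' loan rule keyed only on postal code."""
    return zip_code == "10001"

# The "neutral" rule reproduces the group split exactly: in this toy
# data, every member of group A is approved and every B is denied.
for r in records:
    print(r["zip"], r["group"], approve(r["zip"]))
```

No bias was coded in explicitly; it rode in on a correlated variable, which is why "we removed race from the model" is not, by itself, a defense.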
E**A
Good insight into the dark side of mathematical models used for decision-making
This book offers vivid and detailed examples of the hidden impact of mathematical models on people's lives, and of how these models often target vulnerable populations, at those populations' expense. I highly recommend it.