The Numbers Game
An interview with sociologist Matt Boxer on what we can and can’t learn from new statistics from the ADL and AJC.
Anti-Defamation League CEO Jonathan Greenblatt speaking at the ADL National Leadership Summit in Washington, DC on June 4th, 2019.
On Tuesday, as it has every year since 1979, the Anti-Defamation League (ADL) published its annual audit of antisemitic incidents, in which the organization attempts to run the numbers on reported antisemitism in the past year across the US and to discern trends. The main takeaway was grim: the ADL said that this year’s 2,712 incidents marked an all-time high—a 34% rise over last year’s total. CEO Jonathan Greenblatt said in his statement that “we do know that Jews are experiencing more antisemitic incidents than we have in this country in at least 40 years.” But, as I’ve written, antisemitic incidents—and bias-motivated incidents in general—are notoriously challenging to track accurately, and changes in the ADL’s methodology over time can make it hard to compare different years side by side. Higher numbers could mean that more antisemitic incidents are actually taking place—or simply that the ADL has improved and expanded its reporting procedures. When perusing the report, I found myself wondering how much the data can really tell us about long-term trends in antisemitic incidents.
Also this week, the American Jewish Committee (AJC) released the results of a survey of American and Israeli Jewish millennials’ (aged 25-40) opinions on Israel, campus politics, and Israel-diaspora relations. Even as the ADL was raising the alarm about antisemitism, the AJC results suggested that a clear majority of American Jewish millennials is unaffected by or unconcerned with a perceived “anti-Israel climate” on college campuses—this despite all the resources continually poured into fighting campus political battles on Israel, allegedly for the sake of Jewish students. The survey also had some interesting results about American Jewish millennials’ opinions on ideal political outcomes in Israel/Palestine. It found 52.3% of the 800 respondents believe a solution is possible, and of those, 47% support a two-state solution, 22.5% support one binational democratic state, and 15.1% support “Israeli annexation of the West Bank leading to an extension of Israeli sovereignty, but in which Palestinians have a unique civil status and are represented by Palestinian municipal leaders.” (In response, my colleague Peter Beinart tweeted, “I’d love to have been a fly on the wall at the meetings where [the AJC] decided on ‘unique civil status’ as its preferred euphemism for apartheid.”) Jewish Currents has also reported on the challenges of accurately surveying Jewish opinion on Israel and Zionism, since decisions about question wording can make a big difference in results.
When I have questions about data and survey methodology, I usually look to Matthew Boxer, a sociologist at Brandeis University with expertise in Jewish communal research. For this week’s newsletter (subscribe here), Boxer and I talked about the ADL antisemitism audit and the AJC survey, including the challenges and limits of trying to measure a rise in antisemitic incidents and what best practices might be for asking about Israel/Palestine on surveys.
Mari Cohen: How can we understand the methodology of these audits and what we can and can’t get from them?
Matthew Boxer: I should preface this by saying I think the ADL is an indispensable organization, and I’m not saying what they’ve done here is without value. But the reality is that their methodology excludes a lot of different kinds of incidents. They don’t count instances of discrimination like not being given accommodation to take off for Jewish holidays, unless it’s accompanied by harassment. They don’t include general expressions of bigotry unless they’re overtly antisemitic. They’d count someone saying “Jews will not replace us,” but not necessarily someone blaming everything on George Soros or Michael Bloomberg. They’re counting particular types of incidents that appear on their radar. For example, I tweeted about my public high school where at one point I was the only publicly identified Jew. It was not unusual for me to hear accusations that I personally nailed Jesus to the cross. That’s an antisemitic incident. How was the ADL ever going to hear about that unless I reported it? Things like that happen all the time all over the country. A couple of years ago, in one of their annual surveys of American Jewish opinion, the AJC found that around three-quarters of Jews who said they had experienced an antisemitic incident never reported it at any level. Most incidents don’t get reported anywhere, let alone to the ADL, and some of the incidents that might get reported aren’t necessarily being counted.
So this audit doesn’t necessarily mean that antisemitic incidents have increased since last year. It might mean that people did a better job of recording incidents this year, or that more incidents were sufficiently notorious and got enough press attention to appear on the ADL’s radar. There are a lot of reasons why those numbers might fluctuate that have nothing to do with increasing or decreasing antisemitism. An audit like this is never a good way to judge whether antisemitism is increasing or decreasing. The numbers might have increased because antisemites are bolder than they used to be. When the president of the United States talks about Nazis as “very fine people,” they’re suddenly less afraid of exposing what garbage human beings they are for all the world to see.
MC: So you’re saying the level of antisemitism remains the same, but the number of incidents might rise, if antisemites are more emboldened?
MB: They might be committing more incidents, or the incidents might be getting more severe. Either way, the same baseline level of antisemitism could become more of a problem. We live in such a polarized society that any little thing can turn into an incident. People are more reactive and likely to lash out than they were in the past, and in ways that are likelier to draw attention. To the ADL’s credit, they spell out their methodology and their documentation, so anyone reading it carefully should be able to come to the same conclusion that I do: This is not reflective of every incident, but it’s not supposed to be.
MC: But that seems to conflict with some of the marketing, and with Greenblatt’s statement that the audit shows antisemitism is at its highest level in 40 years.
MB: I think he’s trying to make it easier for people to understand. He might even believe that is 100% accurate; I’ve never spoken to him. I’m just saying it’s not that simple. We do these representative sample surveys of American Jewish communities, and one of the questions we commonly ask is whether respondents personally experienced antisemitism at any point in the past year. Based on the number of people who say yes, there are more incidents in some individual communities than the ADL reported nationally in any given year. For instance, just in Orlando and Kansas City, which between them have around 75,000 Jews, there were more incidents than the ADL reported in the audit for the entire country.
MC: The ADL has been touting a 34% rise in overall incidents, but they note that this year, for the first time, they engaged in partnerships with other Jewish organizations like Hillel International and the Secure Community Network to include incidents reported by those organizations. Eighteen percent of the incidents in the audit came from these new organizational partnerships. What do you make of that?
MB: It’s a double-edged sword. On the one hand, having these new partnerships is going to get them more complete information. I like that the ADL is building these sorts of partnerships and collecting more data. The problem is that even though they’ve been careful to note that this has artificially inflated the numbers relative to previous years, most people looking at it just see the 34% increase. Unless you really understand how to interpret data, you’re probably going to misread it.
MC: Are there potentially better ways to track the actual number of antisemitic incidents?
MB: Some people would point to the hate crime statistics that the Department of Justice gathers every year, but I always say those have some of the same problems as the ADL numbers. The DOJ only counts particular types of incidents, and not all antisemitic incidents rise to the level of a crime, let alone a hate crime. But that doesn’t mean that they’re not antisemitic. Not every hate crime even gets reported. Not every agency that is legally obligated to report on hate crimes actually does so. With the representative sample surveys, a lot of it depends on how you ask the question. If you ask whether people were “victimized” by antisemitism, you tend to get lower numbers than if you ask if they “personally experienced” antisemitism, because psychologically there’s a difference between experiencing something and being victimized by it.
Pew has asked about specific kinds of incidents in their national surveys, and they get some different numbers because they’re asking something very specific. If you ask general questions, people might not remember something that happened to them, whereas if you ask them about something specific, it’s more likely to trigger a memory. But even with well-done, nationally representative surveys, you can get different estimates, partly based on how the sample was collected, partly based on how the question was asked, and partly based on timing—like if a big incident just happened, people are more likely to remember something that happened to them. You could do everything perfectly and still not be sure that you’ve measured it exactly right.
MC: Given those difficulties, is there a more useful question to ask to understand antisemitism in the US than whether it’s on the rise?
MB: When I used to lecture about antisemitism, I would tell my students that antisemitism in the contemporary US is simultaneously much worse and nowhere near as bad as we think it is. We tend to focus on sensational examples of antisemitism that are not necessarily representative of the typical experience of an American Jew.
One example is this panic about the status of Jewish students on college campuses, which are now seen as these horribly hostile places for Jews. Before the pandemic, at least once a week I’d have someone lecturing me that I just don’t understand what it’s like. I’m a professor—I work on a college campus, I spend all day there, I visit other college campuses. The people lecturing me typically haven’t set foot on one in 40 years, so what do they know? When you talk to Jewish students, many of them will tell you that they have experienced antisemitic incidents. That does not mean that the campus is a hostile place to Jews. It means that the campus is a place in the United States, which is a hostile place to minorities of all sorts. That’s the price you pay for being a minority in the US. That doesn’t make it okay, it’s just the reality. That’s not to say that there aren’t college campuses that are genuinely hostile places for Jews. There are some, but they’re the exception, not the rule.
MC: I also want to ask about the AJC survey of American and Israeli Jewish millennials that was just released. There tends to be a lot of debate over how questions about Israel are worded in these polls. What jumps out at you about those questions on this survey?
MB: They are pretty standard, the same kind of questions that have been asked about feeling a connection to Israel for decades. I’m not sure that “connection to Israel” means the same thing today that it did ten, 20, 30, or 40 years ago. There isn’t good data on that, but I suspect respondents understand the question differently now.
To me, connection does not necessarily mean support, and support doesn’t mean rubber-stamping everything Israel does. If my parents tell me they’re disappointed in me, that is support, because they want me to be a good person and to correct myself when I do something wrong. I don’t think everyone understands “support” or “connection” that way when they’re being asked about Israel. Some people might feel emotional attachment or a deep connection to the history, the religion, and the culture, but not to the policies of the Israeli government. Others might feel a religious imperative to be supportive of the Israeli government. We just don’t know.
About one in four respondents are saying that the anti-Israel sentiment “on campus or elsewhere” has forced them to hide their Jewish identity. We don’t even know how much that is on campus and how much is elsewhere. That’s one way I might have designed the survey differently. There are certain places where I will be very careful not to let my Magen David show, and there are other places where I just don’t care. That’s not a campus thing. The vast majority of campuses are just fine for Jews, especially ones where Jews are likely to be in large numbers. I’m a little more concerned about places where there aren’t many Jews, because large numbers give you protection you don’t have when you’re alone.
MC: What do you think about the wording of questions about respondents’ preferred political outcomes in Israel/Palestine? For example, the use of the term “unique civil status” to describe what many observers would call “apartheid.”
MB: I want to study how survey respondents understand these questions. I’ve had conversations with a lot of Jewish organizations about including questions like these on their surveys, and many of them don’t want to because it’s so controversial. They don’t see what they could do with those questions programmatically without offending somebody, so they avoid asking the questions. And if you don’t ask the questions, you’re not going to get the data, you’re not going to know what people think, and you’re not going to understand how these attitudes are related to Jewish life in connection to Israel. So I like seeing the effort put in to include these questions. As a researcher, I would love to know if they did any cognitive testing on the wording, because it would help me understand their findings here a little bit better.
Some years ago, a colleague and I wrote a paper looking at surveys of American Jewish opinion on the 2015 Iran nuclear deal. In that context, we found a study of the general US population, where at first they surveyed respondents on their views on the deal, and then they presented vetted, factually accurate expert opinions for and against it. Presenting information that way, simply enough to understand but also nuanced, has a way of changing people’s views. I think that would be a really interesting exercise with regard to Israel/Palestine. It would be relatively easy to find expert opinion on a two-state solution versus a binational state versus Israeli sovereignty everywhere versus just maintaining the status quo; you can find experts who hold each of these opinions, you can find a decent cross section of the population, and you can see the effect of getting expert information rather than just leaving people to rely on their own biases.