
How Much Does Your Kid Really Know About Israel?

How much do Jewish kids know about Israel? According to a new report, very little.

A team of researchers from Brandeis University created a multiple-choice test for measuring Israel literacy. Like so many multiple-choice tests of literacy conducted over the last century, this one found that kids don’t know enough. While most students knew that Benjamin Netanyahu is the prime minister of Israel, few identified Etgar Keret as an Israeli novelist or located the headquarters of the Palestinian Authority in Ramallah.

The authors provide few details about how they chose the questions to include. They convened a panel of experts and solicited suggestions for “domains and concepts panelists thought were crucial for student understanding.” They don’t say how these “experts” determined what knowledge is “crucial.”

How many adults know the location of the PA headquarters? Why is it “crucial” that a student know that the headline “Ariel Sharon Touches a Nerve and Jerusalem Explodes” refers to the beginning of the second intifada? That the aliyah from the former Soviet Union was larger than the aliyah from Morocco? Is a student who gets these questions wrong illiterate when it comes to Israel?

Tests of random factoids ripped from context perpetuate a view of knowledge as memorization and regurgitation. That view violates everything we’ve learned about learning in the past quarter century.

The educational experts on the panel of advisers had to know this. The report’s writers even mention doubts about whether literacy is the kind of thing that can be captured and tested by a multiple-choice exam. The group proceeded despite those objections.

Multiple-choice tests define student achievement as “knowing the most facts” rather than examining the depth and quality of those facts and how students use them to build arguments. Nonetheless, the authors conclude that isolated facts still play an essential role in supporting student thinking.

The authors’ own findings challenge this claim. They interviewed students about Israel and then gave them news articles to read. Some of the students with the least factual knowledge — the ones who couldn’t, for example, identify Netanyahu as the current prime minister of Israel — were able to develop and argue sophisticated positions based on context and information gleaned from the article. The report notes: “The interviews revealed more about the intellect and ability of students… than their existing knowledge of Israel.” In other words, when it came to discussing Israel, thinking, reading and speaking skills proved more important than the ability to blacken the correct bubble.

Why then a multiple-choice test? The decision suggests a lack of creativity in thinking about what it means to “know Israel.” But other assessment tools do exist.

My own research focuses on the stories students tell about Israel. Over the past few years, I have collected over 400 responses by Jewish high school students to the question, “Tell me the whole history of the state of Israel in as much or as little detail as you want.” Their responses shed light on which events in Israel’s history they think are most important (many start with the Exodus, for example), and include many of their thoughts on the most pressing political issues of the day.

Unlike a multiple-choice test, these accounts show what students misunderstand and why. They reveal not just what students know, but how they use what they know to build a historical account that matters to them. That tells us far more about what these students know, think and feel about Israel than checking whether they have memorized that the PA is headquartered in Ramallah, not Nablus.

The bank of test questions isn’t better than nothing. It perpetuates a deeply flawed conception of what it means to know about something. At a time when public education is embracing the Common Core standards, which emphasize the connections between what we know and what we do with that knowledge, leading researchers in Jewish education are doubling down on a multiple-choice test of disconnected facts. Can’t we do better?

Jonah Hassenfeld is a PhD student in Education and Jewish Studies at Stanford University. He is a Jim Joseph fellow and Wexner fellow/Davidson scholar.


