In this piece for The Dial, Julia Webster Ayuso reports on the field of forensic linguistics, in which linguists look for patterns in language—syntax, punctuation, spelling, word choice—to find clues in criminal investigations. This type of analysis is best known for its role in cracking the Unabomber case; in 1995, the mysterious bomber published a 35,000-word anti-technology manifesto, which—combined with the bomber’s letters—provided enough linguistic evidence to narrow the list of suspects and eventually identify Theodore Kaczynski as the bomber. “[W]e all use language in a uniquely identifiable way that can be as incriminating as a fingerprint,” Ayuso writes.

According to forensic linguists, we all use language in a uniquely identifiable way that can be as incriminating as a fingerprint. The word “forensic” may suggest a scientist in a protective suit inspecting a crime scene for drops of blood. But a forensic linguist has more in common with Sherlock Holmes in “A Scandal in Bohemia.” “The man who wrote the note is a German. Do you note the peculiar construction of the sentence?” the detective asks in the 1891 short story. “A Frenchman or a Russian could not have written that. It is the German who is so uncourteous to his verbs.”

The term “forensic linguistics” was likely coined in the 1960s by Jan Svartvik, a Swedish linguist who re-examined the controversial case of Timothy John Evans, a Welshman who was wrongfully convicted of murdering his wife and daughter and hanged in 1950. Svartvik found it unlikely that Evans, who was illiterate, had written the most damning parts of his confession, which had been transcribed by police and likely tampered with. The real murderer was Evans’s downstairs neighbor, who turned out to be a serial killer.

Cheri has been an editor at Longreads since 2014. She's currently based in the San Francisco Bay Area.