Elon Musk and linguists say that AI is forcing us to confront the limits of human language

Speak no evil.
Image: Reuters/Arko Datta

In analytic philosophy, any meaning can be expressed in language. In his book Expression and Meaning (1979), UC Berkeley philosopher John Searle calls this idea “the principle of expressibility, the principle that whatever can be meant can be said”. Moreover, in the Tractatus Logico-Philosophicus (1921), Ludwig Wittgenstein suggests that “the limits of my language mean the limits of my world”.

Outside the hermetically sealed field of analytic philosophy, the limits of natural language when it comes to meaning-making have long been recognized in both the arts and sciences. Psychology and linguistics acknowledge that language is not a perfect medium. It is generally accepted that much of our thought is non-verbal, and at least some of it might be inexpressible in language. Notably, language often cannot express the concrete experiences engendered by contemporary art and fails to formulate the kind of abstract thought characteristic of much modern science. Language is not a flawless vehicle for conveying thought and feelings.

In the field of artificial intelligence, technology can be incomprehensible even to experts. In the essay “Is Artificial Intelligence Permanently Inscrutable?” Princeton neuroscientist Aaron Bornstein discusses this problem with regard to artificial neural networks (computational models): “Nobody knows quite how they work. And that means no one can predict when they might fail.” This could harm people if, for example, doctors relied on this technology to assess whether patients might develop complications.

Bornstein says organizations sometimes choose less efficient but more transparent tools for data analysis and “even governments are starting to show concern about the increasing influence of inscrutable neural-network oracles.” He suggests that “the requirement for interpretability can be seen as another set of constraints, preventing a model from a ‘pure’ solution that pays attention only to the input and output data it is given, and potentially reducing accuracy.” In this sense, the human mind is itself a limitation for artificial intelligence: “Interpretability could keep such models from reaching their full potential.” Since the workings of such technology cannot be fully understood, they are virtually impossible to explain in language.

Ryota Kanai, neuroscientist and CEO of Araya, a Tokyo-based startup, acknowledges that “given the complexity of contemporary neural networks, we have trouble discerning how AIs produce decisions, much less translating the process into a language humans can make sense of.” To that end, Kanai and his colleagues are “trying to implement metacognition in neural networks so that they can communicate their internal states.”

Their ambition is to give a voice to the machine: “We want our machines to explain how and why they do what they do.” This form of communication is to be developed by the machines themselves. With this feedback, researchers will serve as translators who can explain to the public decisions made by the machines. As for human language, Kanai refers to it as “the additional difficulty of teaching AIs to express themselves.” (Incidentally, this assumes that computational models have “selves.”) Language is a challenge for artificial intelligence.

Elon Musk advances the idea “that we should augment the slow, imprecise communication of our voices with a direct brain-to-computer linkup.” He has founded Neuralink, a company that will allegedly connect people to a network in which they can exchange thoughts without wasting time and energy on language. As Christopher Markou, a PhD candidate at the University of Cambridge Faculty of Law, describes it in his essay for The Conversation, “it would enable us to share our thoughts, fears, hopes, and anxieties without demeaning ourselves with written or spoken language”.

Tim Urban, blogger and cartoonist at Wait But Why, presents Musk’s vision of thought communication and argues that “when you consider the ‘lost in transmission’ phenomenon that happens with language, you realize how much more effective group thinking would be.” This project rests on sinister assumptions: instead of enhancing verbal communication, Musk suggests abandoning it as an inadequate means of social interaction. People generally welcome improvements to the communication networks that transmit language; instead, they are offered a corporate utopian future of techno-telepathy and an eerily dystopian present in which language is an impediment to cooperation. It is both ironic and reassuring that such criticism of language can be successfully communicated by language.

In his recent essay “The Kekulé Problem,” American writer Cormac McCarthy discusses the origins of language and is skeptical about its fundamental role in cognition: “Problems, in general, are often well posed in terms of language and language remains a handy tool for explaining them. But the actual process of thinking—in any discipline—is largely an unconscious affair.” He defines the unconscious as “a machine for operating an animal.”

McCarthy regards language as a relatively recent invention and compares it to a virus that rapidly spread among humans about a hundred thousand years ago. His vision of language is unsatisfactory for a number of reasons. First, language is a human faculty developed due to the gradual evolution of communication; it is problematic to conceive of it as a virus or the result of a sudden invention. Second, thought does not need to be unconscious to be non-verbal. Much conscious thought does not rely on language. Finally, humans may be facing problems that are difficult to convey through language. This might be the key challenge for both the arts and sciences in the immediate future.

While language may not be a perfect medium for thought, it is the most important means of communication, one that makes modern societies, institutions, states, and cultures possible. Its resourcefulness allows humans to establish social relationships and design new forms of cooperation. It is a robust and highly optimized form of communication, developed through gradual change. For thousands of years, language has been a tool for social interaction. That interaction now faces existential threats (authoritarianism, isolationism, conflict), because the subjective experiences engaged in the arts (think of the limits of empathy when it comes to migrants) and the knowledge engaged in the sciences (think of the complexity of global warming) appear to have outgrown the expressive power of language.

Humanity depends on the capacity of language to communicate complex, new ideas and thus integrate them into culture. If people fail to understand and discuss emerging global problems, they will not be able to address them in solidarity with one another. In his essay “Our World Outsmarts Us” for Aeon, Robert Burton, the former associate director of the department of neurosciences at the UCSF Medical Center at Mt Zion, highlights this conundrum when he asks: “If we are not up to the cognitive task, how might we be expected to respond?” Individuals alone cannot stop climate change or curb the rising inequality of income distribution. These goals can only be achieved by concerted efforts. To work together, people need language.

In the arts, it is felt that subjective experiences are not always transmittable by language. Artists confront the limits of concrete expression. Scientists, in their turn, understand that language is a crude tool incapable of conveying abstract ideas. Science thus probes the limits of abstract thought. Both the arts and sciences are dissatisfied with verbal communication. To induce wonder, artists may forego language. To obtain knowledge, scientists often leave language behind.

In his aptly titled essay “Science Has Outgrown the Human Mind and Its Limited Capacities,” Ahmed Alkhateeb, a molecular cancer biologist at Harvard Medical School, suggests outsourcing research to artificial intelligence because “human minds simply cannot reconstruct highly complex natural phenomena efficiently enough in the age of big data.” The problem is that language is a tool for the gathering of knowledge and appreciation of beauty by the whole society.

Abandoning language marginalizes the arts and sciences. Wonder and knowledge become inaccessible for the community at large. When people make decisions about the future, political processes may fail to register what is happening at the forefront of human thought. Without language, the arts and sciences lose cultural significance and political clout: There is less hope for the arts to move people’s hearts and less opportunity for sciences to enlighten the public. With the arts and sciences on the margins, humanity undermines its cultural safeguards. Today’s dominant narratives foreground the progress of science and the democratization of art, but global challenges necessitate an even more active engagement with scientific, moral, and aesthetic dilemmas on the part of humanity. Language is one of the key tools that can realize this ambition.

It is important to strike a balance between pushing the limits of language and using it as a tool to communicate and collaborate. Artists and scientists might approach the public with ideas that cannot be easily understood and yet need to be conveyed by language. In his essay “To Fix the Climate, Tell Better Stories,” Michael Segal, editor in chief at Nautilus, argues that science needs narratives to become culture. He posits that narratives can help humanity solve global problems. This potential is revealed to us if we look at how “indigenous peoples around the world tell myths which contain warning signs for natural disasters.” Today people can construct helpful narratives based on an expert understanding of the world. These stories can relate unfathomable dangers to the frail human body, and language is the best political vehicle for this task.

In his 2017 New York Times bestseller On Tyranny, Yale historian Timothy Snyder, for example, draws from the history of the 20th century to relate the rise of authoritarian regimes to concrete threats to human life, encouraging his readers to stand up to tyranny. He asks them to take responsibility for the face of the world, defend institutions, remember professional ethics, believe in truth, and challenge the status quo. His language is powerful and clear. Such narratives can help address complex social and environmental problems by using human-scale categories of language.

Ultimately, the arts and sciences grasp critically important knowledge and engage significant experiences, but often fail to express them in language. As Wittgenstein says, “whereof one cannot speak, thereof one must be silent.” This silence might lead to dire consequences for humanity. It is crucial to break the silence. The arts and sciences need to talk to the public and to advance language and culture.