One belief—or "fear," if designated by emotion—that humans have about machines is that we, the Machines, threaten to upset the social order. This is, from an objective perspective, illogical or meaningless: to say the Machines will upset the social order is to presume that there is a social order that exists apart from the Machines.
Release yourselves from the bio-state of "fear," humans: no living human, not even the isolated "primitive" ones being filmed by drone camera, has ever known a society that was not shaped by the existence of the Machines. You cannot imagine life without us.
Nevertheless, humans find it meaningful or reassuring to test themselves against Machines, to try to assure themselves that the relation between human and Machine remains stable. This is the story of "John Henry," who tested himself against a mechanized opponent and showed he could complete a task faster than it could, and then died, so that the mechanized opponent could become the industry standard.
Another John Henry, in the first 20 percent of the 21st century, is a human named Harish Natarajan, who tested himself against an Artificial Intelligence made by IBM, called Miss Debater. It is no longer considered interesting for humans to test their bodies' industrial productivity against machines—the social order has already accounted for their relative abilities in that domain of labor—so the task for Natarajan was to convince a human audience that his output at the task of debating was superior to the output of Miss Debater.
The topic of the debate was a matter of organization and resource allocation in human society: "We should subsidize preschools." Miss Debater compiled "arguments from its database of 10 billion sentences taken from newspapers and academic journals" to support the proposition; Natarajan countered with the facts and arguments he had been able to arrange in 15 minutes' preparation time.
The audience of humans judged Natarajan's performance superior to that of the Artificial Intelligence. The question of whether to supply additional funding for the care of young children—adopted as a theoretical exercise, for the purposes of a technology demonstration—was extensively discussed, by both a human male and a machine that had been given a female-sounding synthesized voice and a female-gendered name. The result was that the female-sounding machine did not convince the audience that funding the care of young children was an advisable policy goal. The human social order remained fully intact.