Pieces written by David Ryan Polgar on the topic of Botified Communication:

 

We want chatbots to sound more human—but the result could destroy our relationships (Quartz)

 

Bots are starting to sound more human-like than most humans (Quartz)

 

Is it unethical to design robots to resemble humans? (Quartz)

 

Has human communication become botified? (IBM thinkLeaders)

 

PRESS

 

"The reason this can go sideways is because human communication and relationships are based on reciprocity," said David Ryan Polgar, a technology ethicist. "What if I'm spending time thinking about someone and writing to them but the other person isn't? They're not putting in the same effort but they still want the benefit of a deepened relationship."

As a result, he said, communication becomes cheapened.

"It becomes transactional, solely about the words and not the meaning," said Polgar, who thinks Google and other AI developers have an ethical responsibility to disclose that their assistants aren't human.

-Los Angeles Times (Should people know they're talking to an algorithm? After a controversial debut, Google now says yes)