Why did Taylor Swift want to sue Microsoft? President of the tech giant makes the shocking revelation in his new book

Taylor Swift had plans to sue Microsoft over its racist chatbot.

Taylor Swift claims ownership over many things and has sought to trademark almost everything remotely associated with her. The successful American singer has trademarked her birth year, 1989, and lyrics from her bestselling songs.

Now it has emerged that the American music sensation wanted to sue technology giant Microsoft over its racist AI chatbot. However, the threatened suit wasn't about the racist activity the artificial intelligence chatbot engaged in online; it was about the chatbot's name, Tay.

Taylor threatened to sue Microsoft

Microsoft President Brad Smith makes this shocking revelation in his new book, Tools and Weapons. In an excerpt, he recounts receiving a legal notice from Taylor Swift's attorney while he was on vacation.

He recalls that an e-mail arrived from a Beverly Hills lawyer who introduced himself as Taylor Swift's legal representative. The e-mail stated that the name Tay, as Brad surely knew, was closely associated with the lawyer's client, Swift. The lawyer went on to argue that the use of the name Tay created a false and misleading association between the American singer and the chatbot, and that it violated federal and state laws.

Brad clarifies that he had no inkling that the name 'Tay' was associated with Taylor Swift.

What is Tay?

Tay, which tweeted as TayTweets, was an artificial intelligence chatbot from Microsoft, modeled on XiaoIce, a similar bot the company had built for the Chinese market. It was designed for U.S. users under the name 'Tay' and built to learn from social media and hold conversations with teenagers and young adults on social platforms.

However, shortly after launch, it began tweeting racist comments and conspiracy theories it had picked up from pranksters online. Microsoft pulled the bot within 18 hours and issued an apology for the chatbot's inflammatory comments.