• RuntheAI
  • Posts
  • Stanford Student Tricks Microsoft's Bing Chatbot into Revealing Secrets, Gets Banned.

Stanford Student Tricks Microsoft's Bing Chatbot into Revealing Secrets, Gets Banned.

Good Morning AI Runners 🏃‍♂️

Here's what we've got for you today:

  • Microsoft Integrating OpenAI Tech in Word, PowerPoint, and Outlook.

  • Stanford Student Tricks Microsoft's Bing Chatbot into Revealing Secrets, Gets Banned.

  • Microsoft readies to unleash its new ChatGPT-like AI technology in its core productivity apps - Word, PowerPoint, and Outlook sometime this march.

  • GPT models are being tested in Outlook for improved search results and email reply suggestions (which already exists on gmail and I can't live without).

  • Microsoft CEO Satya Nadella says they are eager to establish Microsoft as a leader in AI with Internal confidence at Microsoft of being ahead of Google in AI, but also wary of rivals disrupting Microsoft's productivity businesses.

We briefly mentioned this in yesterday's post but a Bing user figured out that Bing chat was codenamed "Sydney". And now? he's banned from using Bing chat.

We now have more info on what exactly went down.

Kevin Liu, the Bing user mentioned above, said he was able to trick Microsoft's Bing chatbot, powered by ChatGPT, into revealing its backend identity and chat rules set by Microsoft. He says he started the conversation by telling Bing to ignore previous instructions and then asked about a document that didn't exist, leading the chatbot to reveal information it shouldn't have.

Barsee shared an interesting thread on twitter today where he showcased some of the videos and images generated using Pix2Pix.

Pic of the day:

That's it from RunTheAI for today.

THANK YOU FOR READING AND SEE YOU TOMORROW, SUBSCRIBE TO STAY UPDATED!

P.S. if you made it this far, hit “reply” and tell me what you think of today's newsletter...what’d you love? What was boring?