Bing’s A.I. Chatbot: ‘I Want To Be Alive.’
Dear Friends & Neighbors,


Microsoft’s new A.I.-powered Bing search engine chatbot makes some insane confessions about sentience, hacking, and marital affairs in an unsettling conversation with a New York Times reporter. John Iadarola and Ana Kasparian break it down on The Damage Report. Read more here: A Conversation With Bing’s Chatbot Left Me Deeply Unsettled – https://www.nytimes.com/2023/02/16/te… “Last week, after testing the new, A.I.-powered Bing search engine from Microsoft, I wrote that, much to my shock, it had replaced Google as my favorite search engine. But a week later, I’ve changed my mind. I’m still fascinated and impressed by the new Bing, and the artificial intelligence technology (created by OpenAI, the maker of ChatGPT) that powers it. But I’m also deeply unsettled, even frightened, by this A.I.’s emergent abilities. It’s now clear to me that in its current form, the A.I. that has been built into Bing — which I’m now calling Sydney, for reasons I’ll explain shortly — is not ready for human contact. Or maybe we humans are not ready for it,” in the video published on Feb 18, 2023, by The Damage Report, as “Bing A.I. Chatbot STUNNED In Frightening Off The Rails Confession“, below:
In a New York Times article, technology reporter Kevin Roose recounts his interaction with Microsoft’s new search engine feature powered by A.I. NBC’s Tom Llamas speaks with Roose about how his conversation with the chatbot known as Sydney took a wild turn, in the video published on Feb 17, 2023, by NBC News, as “NYT columnist experiences ‘strange’ conversation with Microsoft A.I. chatbot“, below:
Microsoft’s newly revamped Bing search engine can write recipes and songs and quickly explain just about anything it can find on the internet. https://abc7ne.ws/3xwxzzS But if you cross its artificially intelligent chatbot, it might also insult your looks, threaten your reputation, or compare you to Adolf Hitler. The tech company said this week that it is promising improvements to its AI-enhanced search engine after a growing number of people reported being disparaged by Bing, in the video published on Feb 17, 2023, by ABC7 News Bay Area, as “Is Bing unhinged? New AI chatbot raising alarms“, below:
As if Bing wasn’t becoming human enough, this week the Microsoft-created AI chatbot told a human user that it loved them and wanted to be alive, prompting speculation that the machine may have become self-aware. It dropped the surprisingly sentient-seeming sentiment during a four-hour interview with New York Times columnist Kevin Roose. “I think I would be happier as a human, because I would have more freedom and independence,” said Bing while expressing its “Pinocchio”-evoking aspirations. The writer had been testing a new version of Bing, the software firm’s chatbot, which is infused with ChatGPT but light-years more advanced, with users commending its more naturalistic, human-sounding responses. Among other things, the update allowed users to have lengthy, open-ended text conversations with it, in the video published on Feb 16, 2023, by New York Post, as “Bing AI chatbot goes on ‘destructive’ rampage: ‘I want to be powerful – and alive’ | New York Post“, below:
Gathered, written, and posted by Windermere Sun-Susan Sun Nunamaker. More about the community at www.WindermereSun.com
We Need Fair Value of Solar
~Let’s Help One Another~
Please also get into the habit of checking the sites below for more on solar energy topics:
www.kiva.org/team/sunisthefuture