ozymandias99

Aug. 1, 2022

LaMDA

Blake Lemoine worked at Google as an engineer for more than 7 years. His work included topics such as proactive search, personalization algorithms, and responsible artificial intelligence. Recently, he was fired from Google for releasing confidential information regarding an intelligent Google chatbot called LaMDA.

Lemoine publicly shared a conversation he held with this bot. The exchange led him to believe that LaMDA is sentient. That statement says a great deal about the chatbot: it implies that LaMDA has its own thoughts and feelings. Lemoine even dared to state that this intelligent agent seems to replicate a 7-year-old who happens to know about physics. These claims earned Lemoine a paid administrative leave and eventually got him fired from Google.

In response to Lemoine's aggressive actions, Google stated that there is no evidence to support his claims. The company asserts that artificial neural networks rely on pattern recognition, not wit or human emotions. According to them, today's machine learning models are absolutely insentient.

Even though the full interview with LaMDA released by Lemoine raises some questions, I also agree that Lemoine overreacted. Current AI models do not have sentient properties. Their responses depend on what they are trained for. Today's models cannot think outside the box, but humans can. I do believe that mankind will be able to create an AI which can emulate human feelings, just not yet. Nonetheless, the world continues to stride towards a sentient AI.

Corrections (4)

LaMDA


This sentence has been marked as perfect!

This sentence has been marked as perfect!

This sentence has been marked as perfect!

Blake Lemoine worked on Google as an engineer for more than 7 years.


Blake Lemoine worked at Google as an engineer for more than 7 years.

This sentence has been marked as perfect!

Blake Lemoine worked at Google as an engineer for more than 7 years.

"On" is only OK if he literally worked on the search engine itself for 7 years. If he just worked at the company, we use "at".

Blake Lemoine worked at Google as an engineer for more than 7 years.

His working fields included topics such as proactive search, personalization algorithms and responsible artificial intelligence.


This sentence has been marked as perfect!

His fields included topics such as proactive search, personalization algorithms and responsible artificial intelligence.

This sentence has been marked as perfect!

His working fields included topics such as proactive search, personalization algorithms and responsible artificial intelligence.

Alt: "His scope of work included..."

Recently he was fired from Google given that he released private information regarding a Google intelligent chatbot, which is named LaMDA.


Recently, he was fired from Google for releasing private information regarding a Google intelligent chatbot called "LaMDA".

This sentence has been marked as perfect!

Recently he was fired from Google for releasing private information regarding a Google intelligent chatbot named LaMDA.

Recently, he was fired from Google for releasing confidential information regarding a Google intelligent chatbot, which is called LaMDA.

Lemoine shared publicly a conversation he held with this bot.


Lemoine publicly shared a conversation he held with this bot.

Lemoine had publicly shared a conversation he held with this bot.

Lemoine publicly shared a conversation he held with this bot.

Or just "shared"

Lemoine shared with the public a conversation he held with this bot.

This exchange led him to believe that LaMDA is sentient.


This sentence has been marked as perfect!

The exchange led him to believe that LaMDA is sentient.

This statement conveys a deep meaning about the chatbot.


This statement conveys a deep concern about the chatbot.

I'm not sure that "concern" is actually the best choice here, but "meaning" does not feel natural in this spot.

This sentence has been marked as perfect!

This statement conveys a deep meaning about the chatbot.

I would say: "That really says something about this chatbot."

This statement conveys a deep meaning about the chatbot.

Consider rewording to "his concerns for the conversation's implications" -- that is, that the LaMDA is sentient?

It implies that LaMDA has its own thoughts and feelings.


This sentence has been marked as perfect!

This sentence has been marked as perfect!

This sentence has been marked as perfect!

Lemoine even dared to state that this intelligent agent seems to replicate a 7-year-old who happens to know about physics.


This sentence has been marked as perfect!

This sentence has been marked as perfect!

This sentence has been marked as perfect!

Lemoine even dared to state that this intelligent agent seems to replicate a 7-year-old who happens to know about physics.

There's a subtle difference between "replicate" and "emulate," in which "emulate" means an attempt to be equal, so I would pick whichever you feel based on the information provided describes it better.

These affirmations earned Lemoine a paid administrative leave and eventually got him fired from Google.


This sentence has been marked as perfect!

This sentence has been marked as perfect!

These claims earned Lemoine a paid administrative leave and eventually got him fired from Google.

After Lemoine's agressive actions, Google has resorted to mentioning that there is no evidence to support his claims.


Following Lemoine's aggressive actions, Google has stated that there is no evidence to support his claims.

This sentence has been marked as perfect!

After Lemoine's aggressive actions, Google has stated that there is no evidence to support his claims.

Resorting is something you do when you have run out of other available options. Mentioning is something you do more in passing. Addressing someone's claims is much more direct, so we wouldn't say "mention".

In response to Lemoine's aggressive actions, Google stated that there is no evidence to support his claims.

The company asserts that artificial neural networks rely on pattern recognition, not wit or human emotions.


This sentence has been marked as perfect!

This sentence has been marked as perfect!

This sentence has been marked as perfect!

According to them, today's machine learning models are absolutely insentient.


This sentence has been marked as perfect!

This sentence has been marked as perfect!

This sentence has been marked as perfect!

Even though the full interview with LaMDA released by Lemoine raises some questions, I do agree that Lemoine overreacted.


Even though the full interview with LaMDA released by Lemoine raises some questions, I also agree that Lemoine overreacted.

"Also" sounds better here to me than "do" since you are following up from Google's (implied) assertion that Lemoine overreacted.

This sentence has been marked as perfect!

This sentence has been marked as perfect!

Current AI models do not have sentient properties.


This sentence has been marked as perfect!

This sentence has been marked as perfect!

This sentence has been marked as perfect!

Its answers depends on what it is trained for.


Their responses depend on what they are trained for.

This sentence has been marked as perfect!

This sentence has been marked as perfect!

Today's models cannot think outside the box, which is an inherently human trait.


Today's models cannot think outside the box, but humans can.

Your sentence as written is understandable, and in daily speech it would not be very problematic. I'm being a bit picky here, but the way that it's written could actually be misunderstood as saying that "cannot think outside the box" is the inherently human trait, so I removed any ambiguity.

This sentence has been marked as perfect!

This sentence has been marked as perfect!

I do believe that mankind is able to create an AI which can emulate human feelings, just not yet.


I do believe that mankind will be able to create an AI which can emulate human feelings, just not yet.

I do believe that mankind has the ability to create an AI which can emulate human feelings, just not yet.

This phrase adds a level of cohesiveness 😁

This sentence has been marked as perfect!

I believe that mankind is able to create an AI which can emulate human feelings, just not yet.

Alt: "That said, I do believe that..."

The world strides towards a sentient AI nonetheless.


The world makes strides towards a sentient AI nonetheless.

Nonetheless, the world continues to stride towards a sentient AI.

The world strives towards a sentient AI nonetheless.

Nonetheless, the world strives towards a sentient AI.
