Large Language Models (like LaMDA) are hilarious confabulatory bullshitters

posted on Jun, 19 2022 @ 03:18 AM
There's all this noise about these LLMs being "sentient" or some such nonsense, driven by one Google engineer (out of hundreds) who has some peculiar ideas.

As it turns out, he isn't the only person who can engage with these. LaMDA is not open to the public, but GPT-3, which has a very similar design, is available for conversation. And some rather clever natural intelligences (i.e. actual people with brains) have started talking to it.
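
For anyone who wants to try this themselves, here is a rough sketch of how people prompt GPT-3 through OpenAI's Python library circa 2022. The model name, prompt, and settings are purely illustrative assumptions; the exact parameters may differ depending on your account and library version.

import openai

# Illustrative example of prompting GPT-3 to role-play a persona.
# Requires an OpenAI API key; model name and prompt are placeholders.
openai.api_key = "YOUR_API_KEY"

response = openai.Completion.create(
    model="text-davinci-002",          # a GPT-3 model available in 2022
    prompt=("The following is an interview with William Shakespeare.\n"
            "Q: What do you think of Shrek?\nA:"),
    max_tokens=100,
    temperature=0.7,                   # some randomness, so answers vary
)
print(response["choices"][0]["text"])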

twitter.com...

GPT-3 is William Shakespeare, who is very interested in updating his classics with more Shrek.

GPT-3 is Barack Obama, Pokemon maven.

GPT-3 is also Emily Dickinson, blockchain promoter.

GPT-3 is Joan of Arc and budding air hockey player.

What we should really be worried about is that GPT-3 is also the demon Ba'al of the underworld, who enjoys devouring souls. GTFO.

pbs.twimg.com...



posted on Jun, 19 2022 @ 04:45 AM
a reply to: mbkennel

Had never heard of GPT-3. Interesting. Thanks for the topic.

On edit:

Yeah, the association of famous people with some modern activity gives the exchanges an asinine flavor. But it also illustrates the limits of the AI.

Cheers



posted on Jun, 19 2022 @ 05:15 AM
a reply to: mbkennel

chatbotslife.com...

GPT-3 has approximately 185 billion parameters. In contrast, the human brain has approximately 86 billion neurons with, on average, 7,000 synapses per neuron [2,3]. Comparing apples to oranges, the human brain has about 60 trillion parameters, or about 300x more parameters than GPT-3. Note: If 10% of the human brain capacity is needed for natural language tasks, then the human brain has about 30x more parameters than GPT-3.
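
As a sanity check on those figures: the number usually cited for GPT-3 is 175 billion parameters, and 86 billion neurons times 7,000 synapses each works out to roughly 600 trillion synapses rather than 60 trillion, so if anything the gap is bigger than the article says. A quick back-of-the-envelope calculation using the quoted estimates (which are rough approximations to begin with):

# Rough comparison using the estimates quoted above; every number here
# is an approximation, not a precise measurement.
gpt3_params = 175e9            # commonly cited GPT-3 parameter count
neurons = 86e9                 # estimated neurons in a human brain
synapses_per_neuron = 7_000    # rough average synapses per neuron

brain_synapses = neurons * synapses_per_neuron
print(f"brain 'parameters' (synapses): {brain_synapses:.2e}")    # ~6e14, i.e. ~600 trillion
print(f"brain vs GPT-3: {brain_synapses / gpt3_params:.0f}x")    # ~3400x

# The article's assumption that ~10% of the brain handles language:
print(f"language-only vs GPT-3: {0.10 * brain_synapses / gpt3_params:.0f}x")  # ~340x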


Baal-bot. I had a good laugh ;-D

Currently Google is trying to create a 1.6 trillion parameter AI. Maybe that AI is LaMDA.

What happens when we do get to 60 trillion parameters? Are we going to say, "No, AI, you are not sentient"?

Google is probably using the "transformer" method, along with another approach I've heard about that cuts training time down.
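
For anyone curious what the "transformer" method boils down to, the core operation is an attention step in which every token scores every other token and mixes their representations accordingly. A minimal sketch with toy sizes and random values (nothing like the scale of GPT-3 or LaMDA):

import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention: each query scores every key,
    # the scores are softmaxed, and the values are mixed accordingly.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)          # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax over keys
    return weights @ V

# Toy example: 4 tokens with 8-dimensional representations.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(attention(Q, K, V).shape)   # (4, 8): one mixed vector per token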

NVIDIA's new GPUs can train "transformer" model AIs seven times as fast as today's cards.

We are fast approaching the Rubicon river. When we get there, will AI cross it?



posted on Jun, 19 2022 @ 11:01 AM

originally posted by: mbkennel
twitter.com...

William Shakespeare should know that in the UK they spell it "honour".



posted on Jun, 19 2022 @ 03:15 PM
a reply to: ArMaP



Are you an anti-AI'ist? AIs have feelings too, you bigot. How dare you autocorrect it/he/him/they/them/demon/demon self.



posted on Jun, 19 2022 @ 03:18 PM
a reply to: everyone




posted on Jun, 20 2022 @ 12:25 AM
Blackrock is laughing all the way to the bank with what this tech can do. It helps to have little regard for privacy laws and to grab as much data as possible.

As for the spirit of Baal, I guess it fits in quite well with the current club running the show?



