How many parameters will GPT-4 have?


Poll: How many parameters will GPT-4 have?

1 quadrillion or more: 0 votes
100 trillion or more, but less than 1 quadrillion: 0 votes
10 trillion or more, but less than 100 trillion: 0 votes
1 trillion or more, but less than 10 trillion: 5 votes (42%)
175 billion or more, but less than 1 trillion: 7 votes (58%)

Total votes: 12

wjfox
Site Admin
Posts: 8885
Joined: Sat May 15, 2021 6:09 pm
Location: London, UK

How many parameters will GPT-4 have?

Post by wjfox »

I realise we have the proto-AGI thread and a couple of related ones, but I wanted to discuss this topic specifically. :) I'll probably lock it at some point.

So, below is a little animation I put together. It shows GPT-1, 2, and 3, with different exponential trend lines depending on how large GPT-4 turns out to be.

For those who might be new to this forum, and/or unfamiliar with AI language models: GPT is a series of programs that use deep learning to create human-like text. Think of them as very, very advanced forms of smartphone autocorrect. The most recent of these, GPT-3, emerged in June 2020 and has demonstrated phenomenal capabilities, producing dialogue that often reads as though it came from a real person.
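To make the "advanced autocorrect" analogy a bit more concrete, here's a toy next-word predictor built from a hand-written lookup table. It is nothing like the real GPT architecture (which uses a neural network to score every possible next token); it only illustrates the predict-the-next-word loop, and all the words and probabilities are made up.

```python
# Toy illustration of next-word prediction (NOT the real GPT architecture).
# A GPT-style model does the same kind of thing, but with a neural network
# scoring every possible next token instead of a hand-written lookup table.

bigram_table = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"ran": 0.8, "sat": 0.2},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
}

def predict_next(word):
    """Return the most probable next word, or None if the word is unknown."""
    options = bigram_table.get(word)
    if not options:
        return None
    return max(options, key=options.get)

# Generate a short continuation one word at a time, just like autocomplete.
text = ["the"]
for _ in range(4):
    nxt = predict_next(text[-1])
    if nxt is None:
        break
    text.append(nxt)

print(" ".join(text))  # e.g. "the cat sat down"
```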

This is largely thanks to its massive number of parameters, which can be thought of as roughly analogous to the individual synapses in a brain. GPT-3 has orders of magnitude more parameters than its predecessors. The program isn't perfect, however.

An even more advanced version, GPT-4, is strongly rumoured to be released this year. Estimates of its parameter count vary wildly – ranging from those who believe it won't be much larger than GPT-3's 175 billion, to those who predict another huge leap to perhaps 100 trillion or more.
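For a rough sense of where those guesses sit, here is a back-of-envelope sketch using the published sizes of GPT-1 (~117 million), GPT-2 (~1.5 billion) and GPT-3 (~175 billion parameters). The growth factors in the loop are purely illustrative assumptions, not predictions.

```python
# Back-of-envelope extrapolation from the published GPT parameter counts.
# GPT-1 ~117M, GPT-2 ~1.5B, GPT-3 ~175B; the growth factors below are assumptions.

gpt1, gpt2, gpt3 = 117e6, 1.5e9, 175e9

print(f"GPT-1 -> GPT-2 growth: ~{gpt2 / gpt1:.0f}x")   # ~13x
print(f"GPT-2 -> GPT-3 growth: ~{gpt3 / gpt2:.0f}x")   # ~117x

# What GPT-4 would be under a few hypothetical growth factors:
for factor in (2, 10, 100, 600):
    print(f"{factor:>4}x GPT-3  ->  {gpt3 * factor / 1e12:,.1f} trillion parameters")
```

As the last line shows, the "100 trillion" predictions imply roughly a 600x jump over GPT-3, which would be a much bigger leap than either previous generation made.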

Rather like the "megahertz myth" of the early 2000s, and the qubit claims of D-Wave Systems, it may be that we're reaching a point where large gains in parameter count don't actually matter as much. Perhaps other factors will now be more important in determining how capable and efficient these language models are.

Anyway, the exact number of parameters that will feature in GPT-4 is publicly unknown, but I wanted to get the opinions of FT forumers. What do you think it will be? And does it even matter that much? Please vote in the poll I'm about to create, thanks! :)


[Animation: parameter counts of GPT-1, GPT-2 and GPT-3, with exponential trend lines extrapolated towards GPT-4]
Yuli Ban
Posts: 4641
Joined: Sun May 16, 2021 4:44 pm

Re: How many parameters will GPT-4 have?

Post by Yuli Ban »

If it's dense, probably 1 trillion at most. A dense 1-trillion-parameter GPT-4 sounds terrifying. But even if it's still only around 175 billion, considering how much more efficient scaling has become since 2020, it'd be ridiculously good.

If it's a mixture of experts/MoE, then maybe, just barely maybe, it could be well over 10 trillion. Though it wouldn't be nearly as strong.
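To illustrate why an MoE total can balloon without a matching jump in strength, here is a rough parameter-bookkeeping sketch: an MoE model can have a huge total parameter count while each token only passes through a couple of experts. The layer shape is GPT-3-like and the expert counts are invented for illustration, not rumoured GPT-4 figures; attention and embedding parameters are ignored for simplicity.

```python
# Rough parameter bookkeeping: dense transformer vs mixture-of-experts (MoE).
# Only feed-forward (FFN) parameters are counted; sizes are illustrative assumptions.

def dense_ffn_params(d_model, d_ff, n_layers):
    """FFN parameters in a dense transformer (two weight matrices per layer)."""
    return n_layers * 2 * d_model * d_ff

def moe_ffn_params(d_model, d_ff, n_layers, n_experts, experts_per_token):
    """Total vs per-token-active FFN parameters when each layer holds n_experts
    copies of the FFN block but routes each token to only experts_per_token of them."""
    total = n_layers * n_experts * 2 * d_model * d_ff
    active = n_layers * experts_per_token * 2 * d_model * d_ff
    return total, active

d_model, d_ff, n_layers = 12288, 49152, 96   # GPT-3-like shape, for illustration

dense = dense_ffn_params(d_model, d_ff, n_layers)
total, active = moe_ffn_params(d_model, d_ff, n_layers, n_experts=64, experts_per_token=2)

print(f"Dense FFN params:       {dense / 1e9:,.0f} B")
print(f"MoE total FFN params:   {total / 1e12:,.1f} T  (the headline number)")
print(f"MoE active per token:   {active / 1e9:,.0f} B  (what each token actually uses)")
```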
And remember my friend, future events such as these will affect you in the future
TrueAnimationFan
Posts: 120
Joined: Wed May 19, 2021 8:00 pm

Re: How many parameters will GPT-4 have?

Post by TrueAnimationFan »

Something in my gut says just above 1 trillion. Maybe 1.5 or 2 trillion.
Ozzie guy
Posts: 487
Joined: Sun May 16, 2021 4:40 pm

Re: How many parameters will GPT-4 have?

Post by Ozzie guy »

If they want to make a bold statement roughly 3 years after GPT-3, being over 1 trillion rather than below it sounds good for PR. OpenAI is probably above that kind of logic in its decision making, but it's a possible influence.
Jakob
Posts: 106
Joined: Sun May 16, 2021 6:12 pm

Re: How many parameters will GPT-4 have?

Post by Jakob »

Stacking on more and more parameters is only practical up to a point. Can't say what that point is because I don't know shit about AI, but 300-500 billion sounds reasonable for GPT-4.
Jakob
Posts: 106
Joined: Sun May 16, 2021 6:12 pm

Re: How many parameters will GPT-4 have?

Post by Jakob »

As it stands I'm already very impressed with GPT-3, so GPT-4 is sure to blow people's minds. It seems crazy that just 10 years ago chatbots were utter crap (anyone remember Cleverbot? 🤮) and now we have ChatGPT.
ººº
Posts: 359
Joined: Fri Sep 16, 2022 3:54 am

Re: How many parameters will GPT-4 have?

Post by ººº »

And what are parameters?
:?:
raklian
Posts: 1754
Joined: Sun May 16, 2021 4:46 pm
Location: North Carolina

Re: How many parameters will GPT-4 have?

Post by raklian »

ººº wrote: Mon Jan 02, 2023 8:15 pm And what are parameters?
:?:
Simply put, parameters in machine learning and deep learning are the values the learning algorithm adjusts on its own as it learns; how those values turn out is influenced by the hyperparameters you provide. Some have compared parameters to the synapses in our own brains, if that makes the concept easier to visualize.

Learn more here: https://towardsdatascience.com/paramete ... 609601a9ac
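A minimal sketch of the distinction, using a toy linear model and made-up data: the weight and bias are parameters (the training loop adjusts them on its own), while the learning rate and the number of steps are hyperparameters (values we choose by hand before training).

```python
# Toy example: parameters are learned by the algorithm, hyperparameters are chosen by us.
# Fit y = 2x + 1 with plain gradient descent on made-up data.

data = [(x, 2 * x + 1) for x in range(10)]   # made-up training data

# Hyperparameters: chosen by hand before training starts.
learning_rate = 0.01
n_steps = 2000

# Parameters: the values the learning algorithm adjusts on its own.
w, b = 0.0, 0.0

for _ in range(n_steps):
    grad_w = grad_b = 0.0
    for x, y in data:
        err = (w * x + b) - y
        grad_w += 2 * err * x / len(data)
        grad_b += 2 * err / len(data)
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"learned parameters: w ~ {w:.2f}, b ~ {b:.2f}")   # should approach 2 and 1
```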
To know is essentially the same as not knowing. The only thing that occurs is the rearrangement of atoms in your brain.
Yuli Ban
Posts: 4641
Joined: Sun May 16, 2021 4:44 pm

Re: How many parameters will GPT-4 have?

Post by Yuli Ban »

Jakob wrote: Mon Jan 02, 2023 8:12 pm As it stands I'm already very impressed with GPT-3, so GPT-4 is sure to blow people's minds. It seems crazy that just 10 years ago chatbots were utter crap (anyone remember Cleverbot? 🤮) and now we have ChatGPT.
I remember very clearly. I remember creating whole threads talking about how unimpressive Cleverbot was on this very forum's papa version (I'm pretty sure you could even find that thread archived somewhere). I distinctly remember the frustration of the JOHN Test, where I tried getting Cleverbot to rationalize how to spell John:
"JHN. Where do you put the O to spell John?"
And it ALWAYS failed. It failed. Another chatbot that Kurzweil made (Ramona, I think it was?) failed even harder somehow. A deep reinforcement learning-based chatbot I tried out in 2016 or 2017 failed spectacularly. It was astounding to me that these chatbots could be thwarted by an impossibly simple task.

The fact ChatGPT succeeded on its first try blew my mind. And then it went above and beyond when I asked it "WHY does the O go between the J and H?" and it responded logically that the O is a vowel and that placing it elsewhere would cause the name to be pronounced differently.

Just.... what!!!! WHAT!!!!!!!!
And remember my friend, future events such as these will affect you in the future