
Will Moore's Law break down?

Moores Law Technology AI Singularity growth quantum computers future silicon quantum computer

10 replies to this topic

#1
Galgation

    Member

  • Members
  • PipPip
  • 16 posts

Moore's Law is the principle that computer scientist Gordon Moore observed: roughly every two years (some say 18 months), the number of transistors on a circuit doubles, effectively doubling the power a computer has at its disposal. This happens because transistors keep getting smaller, but Moore's Law has a limit, and that is the simple fact that transistors can't get much smaller than about 5 nanometers, or else they begin to short circuit. We still have some time: transistors today are roughly 40 nanometers, which gives us perhaps six years before we reach the end of Moore's Law. What do you think of Moore's Law? Is there anything we can do to prevent its collapse?
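The arithmetic here is easy to check with a few lines of Python (a back-of-the-envelope sketch; the 40 nm and 5 nm figures are the ones quoted above). One subtlety: a doubling of transistor *density* only shrinks linear feature size by about √2, so counting density doublings gives a longer runway than counting halvings of feature size:

```python
import math

start_nm, limit_nm = 40.0, 5.0   # feature sizes quoted in the post
years_per_doubling = 2.0         # the classic Moore's Law cadence

# Each doubling of transistor density shrinks linear feature size by ~sqrt(2),
# so the remaining doublings are log base sqrt(2) of the shrink ratio.
doublings_left = math.log(start_nm / limit_nm) / math.log(math.sqrt(2))
years_left = doublings_left * years_per_doubling

print(round(doublings_left), "doublings left,", round(years_left), "years")
```

Counting one doubling per *halving of feature size* instead gives three steps (roughly six years at this cadence), which is presumably where the post's figure comes from.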



#2
EpochSix

    Member

  • Members
  • PipPipPipPip
  • 175 posts

Just as we've moved from electromechanical devices to relays, from relays to vacuum tubes, from vacuum tubes to transistors, and from transistors to integrated circuits, we will move to a new foundational technology when Moore's Law meets its physical limits: likely 3D integrated circuits instead of flat circuit boards, or maybe quantum computing, or maybe biocomputing (using biologically derived molecules, such as DNA and proteins, to store, retrieve, and process data).

 



Edited by EpochSix, 15 August 2013 - 11:53 PM.


#3
zEVerzan

    Orange Animating Android

  • Members
  • PipPipPipPipPipPipPipPipPip
  • 3,743 posts
  • LocationSome Underground Sweatshop Probably

Let's not get ahead of ourselves. We'll probably use 3D transistors, then from there go on to quantum computing.

 


 

Encoding information onto DNA likely won't work because, as I've said before and I'll say again: DNA is highly mutable, and mutation is the entire point of its existence.


Edited by EVanimations, 16 August 2013 - 01:27 AM.

I always imagined the future as a time of more reason, empathy, and peace, not less. It's time for a change.
Attention is currency in the "free marketplace of ideas".
I do other stuff besides gripe about the future! Twitter Youtube DeviantArt +-PATREON-+

#4
FutureGuy

    Member

  • Members
  • PipPipPipPipPipPip
  • 557 posts

Moore's Law would effectively fade out only if we kept going with current methods. Every time we move from one technology or method to the next, Moore's Law kicks in again. The next step is 3D transistors. I'm expecting Moore's Law to keep up for at least a couple of decades.



#5
Yuli Ban

    Born Again Singularitarian

  • Moderators
  • PipPipPipPipPipPipPipPipPipPipPip
  • 22,010 posts
  • LocationNew Orleans, LA

Why would it break down when we could fuse classical computing with quantum computing?

We already have the D-wave.

http://www.gizmag.co...-ranking/27476/


And remember my friend, future events such as these will affect you in the future.


#6
Squillimy

    Member

  • Members
  • PipPipPipPipPipPip
  • 924 posts

I can't wait to see how Moore's Law continues on! And it's exciting because we only have to wait ~7 years.


What becomes of man when the things that man can create are greater than man itself?


#7
Squillimy

    Member

  • Members
  • PipPipPipPipPipPip
  • 924 posts
Why would it break down when we could fuse classical computing with quantum computing?

We already have the D-wave.

http://www.gizmag.co...-ranking/27476/

 

Yes, but the D-Wave is huge as fuck, and I don't believe it does classical computing; it focuses on problems that quantum computers handle better than classical computers, such as search algorithms or graph theory problems (like the traveling salesman problem in that article).
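For a concrete sense of why this problem class is hard for classical machines, here's a minimal brute-force traveling-salesman sketch in Python (the distance table is made up for illustration). The number of routes grows factorially with the city count, which is exactly the scaling annealers hope to sidestep:

```python
from itertools import permutations

# Pairwise distances between 4 cities, keyed by (smaller, larger) index.
dist = {
    (0, 1): 2, (0, 2): 9, (0, 3): 10,
    (1, 2): 6, (1, 3): 4,
    (2, 3): 3,
}

def d(a, b):
    return dist[(min(a, b), max(a, b))]

def tour_length(order):
    # Sum the legs, closing the loop back to the starting city.
    n = len(order)
    return sum(d(order[i], order[(i + 1) % n]) for i in range(n))

# Brute force: try all n! orderings and keep the shortest.
best = min(permutations(range(4)), key=tour_length)
print(best, tour_length(best))
```

Four cities means only 24 routes, but at 20 cities the same loop would have to check more than 10^18 of them.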


What becomes of man when the things that man can create are greater than man itself?


#8
Raklian

    An Immortal In The Making

  • Moderators
  • PipPipPipPipPipPipPipPipPipPip
  • 7,171 posts
  • LocationRaleigh, NC
Yes, but the D-Wave is huge as fuck, and I don't believe it does classical computing; it focuses on problems that quantum computers handle better than classical computers, such as search algorithms or graph theory problems (like the traveling salesman problem in that article).

 

 

The article states the following:

 

"The question remains, just how useful is D-Wave's chip? Its current approach to quantum computing is focused tightly on problems that map nicely onto an Ising model. However, it is known that adiabatic quantum computing can efficiently reproduce any computation of which a more conventional quantum gate computer is capable."

 

This gives us the possibility that one day quantum computing will replace all of the everyday computation we take for granted with our classical computers.
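For reference, an Ising model is just an energy function over ±1 spins, and D-Wave's hardware searches for its minimum-energy configuration. Here's a toy sketch in Python (the couplings J and fields h are made-up illustration values), solving by exhaustive search what the annealer solves physically:

```python
import itertools

# Minimal Ising model: spins s_i in {-1, +1}, with energy
#   E(s) = -sum_{i<j} J[i,j] * s_i * s_j  -  sum_i h[i] * s_i
# J and h below are arbitrary illustration values, not D-Wave parameters.
J = {(0, 1): 1.0, (1, 2): -0.5, (0, 2): 0.3}
h = [0.1, -0.2, 0.0]

def energy(s):
    pair = -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    field = -sum(hi * si for hi, si in zip(h, s))
    return pair + field

# Exhaustively enumerate all 2^3 spin configurations to find the ground state
# (the configuration an annealer is designed to settle into).
ground = min(itertools.product([-1, 1], repeat=3), key=energy)
print(ground, energy(ground))
```

Mapping a practical problem onto the machine means choosing J and h so that the ground state encodes the answer; exhaustive search blows up as 2^n, which is why annealing is interesting.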


What are you without the sum of your parts?

#9
RayMC

    F̷̲̅ᴜ̷̲̅ᴛ̷̲̅ᴜ̷̲̅ʀ̷̲̅ᴇ̷̲̅

  • Members
  • PipPipPipPipPip
  • 262 posts
This has been discussed so many times, and the answer is simply no. Not in the foreseeable future, which is a good thing. Using new technologies like 3D transistors and/or graphene, Moore's Law will keep up for many decades to come.

“Wʜᴀᴛ ɪs ɴᴏᴡ ᴘʀᴏᴠᴇᴅ, ᴡᴀs ᴏɴᴄᴇ ᴏɴʟʏ ɪᴍᴀɢɪɴᴇᴅ.” - Wɪʟʟɪᴀᴍ Bʟᴀᴋᴇ


#10
Logically Irrational

    For Lack of a Better Name

  • Members
  • PipPipPipPipPipPipPip
  • 1,547 posts
  • LocationHoover Dam

I think that it's a certainty that the next stage in computing will be 3D integrated chips. After that we'll get into quantum computers or maybe optical computers. These two stages alone could get us into the 2040s, maybe even the 2050s depending on the ultimate potential of quantum computers. Beyond that, I don't think anyone could really say.

 

What's interesting, though, is that each stage of the evolution of computers is often not practically realized until the stage immediately prior to it (or at least nearly so). The first practical theory and demonstration of point-contact transistors occurred in the late 1940s, during the era of vacuum tubes. The idea for the integrated circuit was conceived in the late 1950s, on the cusp of the era of transistors. It seems the long-term eras of computing are not predictable more than two stages ahead. But the fact that we don't have a complete grasp of where computer research will take us doesn't mean we have nowhere left to go.


Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn!

#11
Piman

    Member

  • Members
  • PipPip
  • 19 posts

Moore's Law, which simply describes the number of transistors we can put on a chip as doubling every couple of years, will of course hit a wall, as transistors cannot shrink forever. But the exponential growth of computing will not (yet).

 

The term is sometimes used as a synonym for the growth of computing; so people use the term in two different ways.






