
/sci/ - Science & Math



File: 99 KB, 900x588, silicon.jpg
No.12104082

What will replace silicon chips in order to keep Moore's law going?
By replace I mean within the next ~5 years from now, as silicon really hits the final wall.

>> No.12104101

Honestly, at this point computers are truly astonishingly powerful. You can fit a computer capable of decent self-driving in a car now. Aside from edge cases, more raw computational power isn't that useful. By the time 3nm shit becomes a reality I think we will adapt and spend a lot of time refining things like power consumption and software efficiency.

I suppose some madman could figure out how to grow crystalline 3d computer chips that produce almost no heat so we could go from thin sheets of chips to big solid chunks with orders of magnitude more processors. But that sounds crazy.

>> No.12104110

>>12104101
Wait, aren't the chip wafers already on top of each other?

>> No.12104115

>>12104110
Imagine a 5 inch cube of pure transistors that generates almost no heat: that's what I'm talking about.

>> No.12104116

>>12104101
Silicon chips are not powerful enough to simulate 100,000,000,000 people on a computer the size of a building. We must continue Moore's law.

>> No.12104123

I think we have computers powerful enough for basic AIs now. It's all a question of creating flexible memory-enabled NN systems that can solve a wide variety of tasks by possessing something akin to awareness. By 2025 I think the computers will be overkill for AI.

>> No.12104129

Intel, the absolute psychopaths, think they can push down to 1.4 nm by 2029. Big if true.

>> No.12104212
File: 2 KB, 69x124, basedcellphone.jpg

>>12104116
>>12104129
BASED

>> No.12104229

No amount of computing developments will get me a qt gf

>> No.12104518

>>12104110
they are 3D structures, but the wafers are not really "on top of each other". That anon is talking about making a full superstructure with wafers stacked on top of each other, interconnected so they act as a single unit. This would allow maximum space efficiency (same reason for building skyscrapers: you get huge use out of every square foot of your ground), but the problem with making chips this way is heat.
Right now we are in an intermediate phase with "multicores", where the wafers are not on top of each other but side by side, each in its own package to dissipate heat.

I agree with that anon that well-designed software can dramatically improve the general performance of an existing chip without the need for new, faster ones, but obviously that's not a good idea for the chip industry. Unironically, a lot of software companies make bloated programs "almost" on purpose so the whole industry can keep moving.

"Almost on purpose" meaning that making a streamlined program is not a top priority.
I'm typing this on a 10-year-old notebook and I don't need a new one, but if I were into the latest games I would basically be obligated to upgrade constantly. Interestingly enough, the whole industry moves on gaming alone.

>> No.12104759

>>12104518
>whole industry moves on gaming alone

You're god damn right.

Also, to answer OP: graphene processors, as soon as we figure out how to litho those and make them not suck.

>> No.12104763

>>12104518
Who would ever want a degree in electrical engineering.... CS chads, point and laugh!!!

>> No.12104792

>>12104229
So what will you do? Just keep making new threads to state your personal problem?

>> No.12104841

>>12104082
quantum computing will break Moore's law and give us almost unlimited computing power

>> No.12104923

>>12104841
Quantum computing isn't useful for many problems and is slower than classical computers for the majority of problems.

>> No.12105036

>>12104101
Synchronous CPUs waste roughly 70% of their energy as heat because of the clocks keeping everything in order. Real raw power comes from asynchronous architectures which, to my knowledge, are only being used by billion-dollar HFT firms.

>> No.12105038

>>12104082
aluminum picotransistors

>> No.12105039

>>12104115
It doesn't matter because clock keeping is a really wasteful process. Synchronous computers are really inefficient by design.

>> No.12105119

>>12104923
Progress in quantum computing is much faster: in 3 years, quantum computers improved as much as classical computers did in 20.
https://www.scientificamerican.com/article/googles-quantum-computer-achieves-chemistry-milestone/

>> No.12105265

>>12104763
well...

I'm actually studying EE

>> No.12105435

>>12105119
The point is that quantum computers will not help us at all with many problems; they're a completely different kind of computer. It's not what we're looking for when we look for classical upgrades.
>>12105038
I can't find anything on these chips. Do you have some links to articles?

>> No.12105565

>>12104101

>software efficiency

This. The FPSes I play run smoother than my fucking Facebook tab.

>> No.12105920

>>12105435
name one of these problems

>> No.12106038

>>12105920
League of Legends broken client

>> No.12106043

>>12104082
Zero-g, perfect, defect-free fabs. Proof of concept already exists: perfect optical glass fibers have been made in space.

>> No.12106467

>>12105920
Read this
https://www.forbes.com/sites/chadorzel/2017/04/17/what-sorts-of-problems-are-quantum-computers-good-for/#5d1a4ba2547a

>> No.12107533

>>12104082
Photonic computation...

Basically going from electricity to light. I heard they are developing processors that use light and are suited to incredibly faster computation, and a hidden terminator AI is waiting just for that.

>> No.12108055

I have no idea
t. microelectronics engineer

>> No.12108411

>>12107533
Photon chips are trash

>> No.12108423

>>12108055
Only real answer. No alternative technology is good enough to replace silicon, and there likely isn't going to be one. High-level optimization is the way to go now that the lowest level is as close to perfect as possible.

>> No.12108424

>>12108423
If that's true, though, then the entirety of Kurzweil's singularity stuff is completely impossible and false.

>> No.12108429

>>12108055
You're probably more equipped than most of us here for tackling this question. Maybe look up some papers and come back to us if you find anything anon (assuming you have the time). My very uninformed guess would be that we move away from computation via micro circuitry though and that whatever follows will require a lot of first principles and fundamental physics to find.

>> No.12108432

>>12108424
Who coulda thunk it?

>> No.12108498

>>12108432
But I want to live in a virtual fantasy world with Amazon elf women, fighting dragons and finding treasure. I refuse to accept the idea that computation is already being limited. I know there is an upper bound on computation, but fucking silicon chips at 7nm? GTFO, that's insane, I refuse.

>> No.12109107

>>12104082
Analog computation.

>> No.12109465

>>12104129
>think they can push down to 1.4 nm by 2029
That is just marketing going into overdrive. They lost touch with reality years ago.

>> No.12109597

>>12109465
proof?

>> No.12109608

>>12104129
Hahahahahahha
They have been stuck at ten times that for 6 years now

>> No.12109614

>>12104082
I'd say composite boron-nickel spike field effect transistors are the future.
Have you seen what those fuckers can do?

>> No.12109621

>>12104082
Probably some III-V crystals in quantum computers will eventually hit supremacy; then it'll become cheaper to manufacture and that'll be it for Si.

>> No.12110519

>>12104101
I suppose that electrical chips (which is no longer what I imagine when I hear "computer chip") are going to quantum tunnel whatever you do with them, even at 7nm, unless you cover them in a layer of helium-4 or hydrogen atoms, because otherwise the gap is too large. That way an electron attempting to tunnel crashes into a neutron, but I don't know how you make a few-atoms-thick deuterium layer that stays put at room temperature.

>> No.12110522

>>12110519
Maybe bombard the walls of the junctions with neutrons so it remains more stable.

>> No.12110545

>>12104082
Silicon boron carbide has better properties than silicon alone... Maybe there's another possibility: neurons on a chip. And maybe even another thing: learning to do the maths yourself.

These computers are probably unusable in open space by year 2006, maybe sooner. Another thing: why not graphene computers?

And the final thing is... what the hell do you want to compute? Like, literally, you don't need much more computing than we have now unless you want to make computers play games against each other, but that is a useless hazard for the planet.

>> No.12111063

>>12110545
It's never wrong to have more computational power.
t. Unscrupulous machine learning fuck

>> No.12111101

>>12104101
>Aside from edge cases,more raw computational power isn't that useful.
>Laughs in C++ compile time

>> No.12111111

>>12104082
Any work being done on organic computing? Take 2 neurons from a chicken or something, make them react to some binary stimulus accordingly; scale out; boom, ChickenBrain accelerator card, now in PCIe5!

>> No.12111131

>>12111111
Why aren't we funding this?

>> No.12111166
File: 13 KB, 720x702, 1593999616908.jpg

Ternary, or any base higher than 2. The only reason these didn't take off is that they're a bit more complicated for code monkeys and binary is already in everything.
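For the curious, here's a minimal sketch (mine, not from any real ternary hardware) of balanced ternary, the digit scheme {-1, 0, +1} usually cited in ternary-computing discussions:

```python
# Balanced ternary: each digit is -1, 0, or +1. This is an illustrative
# software sketch only; real ternary machines encoded these in circuit levels.
def to_balanced_ternary(n: int) -> list[int]:
    """Return balanced-ternary digits of n, least significant first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        n //= 3
        if r == 2:       # a digit of 2 becomes -1 with a carry into the next place
            r = -1
            n += 1
        digits.append(r)
    return digits

def from_balanced_ternary(digits: list[int]) -> int:
    """Evaluate balanced-ternary digits (least significant first)."""
    return sum(d * 3**i for i, d in enumerate(digits))

# Round trip: 42 survives encoding and decoding
assert from_balanced_ternary(to_balanced_ternary(42)) == 42
```

One neat property: negation is just flipping every digit's sign, so no separate two's-complement trickery is needed.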

>> No.12111378

>>12107533

Still requires electronic-photonic-electronic.... conversions of the logic

>> No.12111379

>>12111166

Any examples of such chips?

>> No.12111433

>>12108429
The big problem is that currently we can't process any material nearly as well as we process silicon. The microelectronics industry is about 70 years old now, and that's 70 years of working on silicon. There are new materials that look great for a variety of applications, but none of them can be processed as efficiently as silicon for now, and it would take decades of research to bring a new one to the level of mastery we have with silicon.

>> No.12111454

>>12111379
There is a kind of transistor that can produce quantized current levels, but I can't remember what it's called.

>> No.12111463

>>12104082
>~5 years
>replace silicon
Lol no, silicon has so much inertia behind it we will see it refined again and again for a long time before we see chips made of different materials.

>> No.12111472

>>12111463
What do you mean refined? Once we hit the physical limit there is no more refinement.

>> No.12111478

>>12111472
We will never reach "physical limits", only tend asymptotically toward them

>> No.12111482

>>12111478
There is no difference

>> No.12111487

>>12111482
The difference is we can still spend years pumping money into this technology for minimal improvements

>> No.12111495

>>12111487
Lol true

>> No.12111502

>>12111472
The "limit" to silicon is heat management; there is plenty we can do to combat that if you're talking desktops and servers. For mobile devices it's a much bigger challenge, but with more advanced web-based programs and ever-improving network speeds, I could see lots of the load being moved off the devices.
Sure, silicon is going to hit diminishing returns hard eventually, but we aren't going to be buying diamond Ryzen 5s.

>> No.12111537

Good coding practices would do more for improving performance than faster chips. Moore's Law has been covering for increasingly terrible coding for the past couple of decades. "Everyone should learn to code" has led to insanely terrible, inefficient code. There's a reason why CS programs teach things like Big-O notation, but it doesn't matter if the hardware is always getting faster. Now that it looks like we might hit a wall on hardware performance, there are lots of gains to be made on the software side with a little bit of effort.
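A toy illustration of the Big-O point (my own example, nothing from the thread): the same membership query is O(n) against a list but O(1) on average against a set, so a one-line data-structure change dwarfs any hardware upgrade.

```python
import timeit

# Look up 1000 values that sit near the end of a 50,000-element collection.
# The list scans linearly per query; the set hashes straight to the answer.
needles = list(range(49_000, 50_000))
haystack_list = list(range(50_000))
haystack_set = set(haystack_list)

slow = timeit.timeit(lambda: [x in haystack_list for x in needles], number=1)
fast = timeit.timeit(lambda: [x in haystack_set for x in needles], number=1)

# Both produce identical answers; only the cost differs.
assert [x in haystack_list for x in needles] == [x in haystack_set for x in needles]
print(f"list: {slow:.4f}s  set: {fast:.4f}s")  # set is typically orders of magnitude faster
```

No new silicon required, just knowing which container to reach for.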

>> No.12111540

At some point we'll replace electrons with photons, but research into this has been minimal so far because it really would be like starting over, meaning optical computing would need many generations of Moore's-Law-like improvements before it could compete with electron-based gates.

>> No.12111554

>>12111537
It's funny how anti-optimization some people have become. I have started using "bit" instead of "int" where I can in C# to use a quarter of the bytes, and I'm getting shit for it.
>it's 3 bits out of 16GB
>why do you care
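Worth noting C# has no "bit" type; the narrowest integer there is the 1-byte `byte`/`sbyte`, which is presumably what this anon means by "1/4th the bytes". Python's `array` module shows the same 4x storage gap between 1-byte and (typically) 4-byte integer elements:

```python
from array import array

# 1000 boolean-ish flags stored two ways. Typecode 'i' is a C int
# (usually 4 bytes); 'b' is a signed char (always 1 byte).
flags_as_int = array('i', [0, 1] * 500)
flags_as_byte = array('b', [0, 1] * 500)

assert flags_as_byte.itemsize == 1
assert flags_as_int.itemsize > flags_as_byte.itemsize  # typically 4x the storage
assert list(flags_as_int) == list(flags_as_byte)       # identical contents
```

Trivial for one variable, but it's the same habit that decides whether a table of a billion flags takes 1 GB or 4 GB.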

>> No.12113660

>>12111166
We abandoned that in the '70s.

ITT: bitches that don't know about my nigga De Morgan
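For anyone who missed the reference: De Morgan's laws let you rewrite any AND as an OR (and vice versa) using negations, which is part of why a single binary gate type like NAND suffices to build everything. A two-line brute-force check:

```python
from itertools import product

# De Morgan's laws, verified exhaustively over all boolean inputs:
#   not (a and b) == (not a) or (not b)
#   not (a or b)  == (not a) and (not b)
for a, b in product([False, True], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))
    assert (not (a or b)) == ((not a) and (not b))
```

Four input combinations is the whole truth table, so passing the loop is a complete proof for binary logic.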

>> No.12113868

>>12109597
They've been trying to break the 10nm mark for about 5 years or so.

>> No.12114213

>>12104101
>JS
>Python
>webshit
Anon, I'm sorry to inform you, but the industry has been wasting all these resources away.

Though, I won't be sad if hitting the wall will require programmers to be actually competent.

>> No.12114509

Virtual chips that are faster than the hardware counterparts running them.

I guarantee this will be in the future.

>> No.12114552

>>12114509
sooo you think it will be processing power streaming to devices using massive server farms?

ahahahahaAHahah ...nope. why? cost and demand. there isnt actually much of a demand outside of the video game industry for stringer processors and the cost of upkeep on the servers is to high. it ways cheaper to make processors than do what you propose.

after all you couldnt be implying that some one could simulate a processor on hardware and for the result of the simulated processor computing things to be faster than the hardware its on doing the same computation. not even /b/ would put such bad bait forward

streaming services shut down after a year . they tried videogame streaming before it failed. yes play with soem companies hardware in your own house and it failed due to latency and subsequent input lag due to distances. that isnt overcome no mater how good your internet connection is. google is just being dumb and walking into a lions den. people catch when there is a frame rate drop from 45 ffps to 40 for instance. its a common complaint that there are performance dip on steam forums. input lag is unforgivable and a byproduct of electricity not teleporting and instead having to travel miles

>> No.12114567

>>12113660
They were abandoned because binary had already taken off and anything higher than base 2 is kinda difficult.
They might come back, but for very specific use cases.
Anyway, there is still a lot of room for improvement; size isn't everything.

>> No.12114569

>>12111554
AAAGH WHY DO PEOPLE DO THIS. I swear, optimization died the moment people figured out they didn't need to bother.
Like, I get not wanting to perfectly optimize every line, but COME THE FUCK ON. These clowns have over THIRTY fucking variables, all declared as int, and they're using them exclusively as booleans. Shit makes me want to shoot myself.
And God save us from the IF spaghetti.

>> No.12114738

>>12111111

Checks of chicken neurons

>> No.12114760

>>12114552
>not even /b/ would put such bad bait forward
Yet you feed the trolls......

>> No.12116745

>>12111111
wasted sixths

>> No.12116920

>>12114213
>Though, I won't be sad if hitting the wall will require programmers to be actually competent.
That would require a desire for quality, which itself requires a willingness to pay well. Currently there is an attempt to flood the software market with so many third worlders and ghetto dwellers that developers will work for minimum wage, quality be damned.

>> No.12116933

>>12104116
what about a big computer?

>> No.12117089

>>12104082
Carbon nanomaterials (or some composite thereof), probably.

>> No.12117851

>>12114552
It'll come back around.

Bandwidth limits are temporary.

The end of Moore's law is forever.

>> No.12117856

>>12114552
>>12117851
By that logic, a good long-term stock play would be buying into companies that do this type of streaming now.

Everyone thinks they suck right now, but when the whole world has 10Gb+ internet connectivity via optical connection, they become a fantastic idea again.

>> No.12118974

OP here. So after reading through the thread and looking into the various materials and other methods posted, I have come to the conclusion that Moore's law is actually done for this time and computers are not going to get any more powerful than they are now for at least several decades.

>> No.12119004

>>12109608
There is a world of difference between scientifically confirming a method and then putting that into mass production. It is the latter that Intel failed on recently. The R&D is usually 5 to 10 years ahead of the current tech.