
IBM builds its most powerful universal quantum computing processors


posted on May, 17 2017 @ 12:46 PM
This was posted a couple of days ago: China builds five-qubit quantum computer, amazing innovation.

MIT already had one (5 qubits). IBM allowed access on the *waves arms* CLOUD (also 5 qubits). There is a dev kit on GitHub for it. So why China? And why five qubits? Y'all know what is coming, right?


The two new IBM-developed processors include:

A 16 qubit processor that will allow for more complex experimentation than the previously available 5 qubit processor. It is freely accessible for developers, programmers and researchers to run quantum algorithms and experiments, work with individual quantum bits, and explore tutorials and simulations. Beta access is available today through a new Software Development Kit available on GitHub github.com...

A 17 qubit prototype commercial processor, IBM's first, which leverages significant materials, device, and architecture improvements to make it the most powerful quantum processor created to date by IBM. It has been engineered to be at least twice as powerful as what is available today to the public on the IBM Cloud, and it will be the basis for the first IBM Q early-access commercial systems.

Phys.org, May 17, 2017 – IBM builds its most powerful universal quantum computing processors.

-and-

Ars Technica – Bigger is better: Quantum volume expresses computer’s limit.

The Ars Technica article explains how to measure quantum computing by volume rather than just by qubit count. There is a lot of error correction going on in quantum computing (these are still supercooled machines running near absolute zero), and IBM argues that the overall volume of quantum computing operations is what should be measured. Which is funny, because their new 16-qubit machine is only a modest gain over their 5-qubit one even assuming the 5-qubit machine has perfect fidelity (which it doesn't). This is also a universal quantum computer, which blows that D-Wave annealing machine out of the water.
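
To make the "quantum volume" idea concrete, here is a rough Python sketch of how I read it: the useful width of a machine is capped both by its qubit count and by how deep a circuit you can run before errors swamp it. The min(n, 1/(n*eps))^2 form and the error rates below are my own illustrative guesses, not IBM's published numbers.

# Simplified "quantum volume" sketch: usable width is limited both by the
# number of qubits n and by the circuit depth ~ 1/(n * eps) you can reach
# before an effective two-qubit error rate eps swamps the result.
def quantum_volume(n_qubits, eps):
    best = 0.0
    for n in range(2, n_qubits + 1):
        depth = 1.0 / (n * eps)              # rough depth before errors dominate
        best = max(best, min(n, depth) ** 2)
    return best

# Hypothetical error rates: a noisier 16-qubit chip ends up only a modest
# step beyond a cleaner 5-qubit one.
print(quantum_volume(5, eps=0.01))    # ~25
print(quantum_volume(16, eps=0.05))   # ~16, depth-limited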

Anyway, these types of announcements are bound to happen through the end of the year. The goal is "quantum supremacy", where a quantum computer will outpace a supercomputer. IBM has plans to build a 50-qubit computer by the end of the year.

Here is round one!
edit on 17-5-2017 by TEOTWAWKIAIFF because: formatting and spelling



posted on May, 17 2017 @ 12:48 PM
a reply to: TEOTWAWKIAIFF




Y'all know what is coming, right?


No, can you explain please?



posted on May, 17 2017 @ 12:56 PM
This is our Q at its finest:




B - Can you do a dragon shout?

Q - Damn.

B - Damn? Explain, please.

Q - You don't react to what I'm saying at all.


Buck



posted on May, 17 2017 @ 12:59 PM

originally posted by: TEOTWAWKIAIFF
This was posted a couple of days ago: China builds five-qubit quantum computer, amazing innovation.

...

Anyway, these types of announcements are bound to happen through the end of the year. The goal is "quantum supremacy", where a quantum computer will outpace a supercomputer. IBM has plans to build a 50-qubit computer by the end of the year.

Here is round one!


So here we go again: instead of bits, kilobits, megabits and gigabits, we'll have to sit through the ramp-up line of 5 qubits, 16 qubits, 50 qubits, 128 qubits, 256 qubits... 1 mega-qubit and so on and on. PLEASE, IBM (or the Chinese equivalent), just go straight to the 1 giga-qubit and do the world a favour this time around.



posted on May, 17 2017 @ 01:13 PM
D-Wave has an 87+ qubit system, used to solve Ramsey numbers. That was back in 2012.

www.technologyreview.com...

They are up to 2,000 qubits now.

www.dwavesys.com...



posted on May, 17 2017 @ 01:30 PM
Does anyone other than me feel we are going in the wrong direction? Shouldn't we be learning how to evaluate things ourselves instead of typing them into a computer to get the answer? You know, we have to ask the computer the right question, and that takes intelligence too. Will this computer be able to roll its eyes at you to inform you that you are looking in the wrong direction, or will it allow you to follow the wrong path?



posted on May, 17 2017 @ 01:42 PM
a reply to: rickymouse

the computer is the next stage of evolution for mankind



posted on May, 17 2017 @ 01:48 PM
yeah but..

can it play games?



posted on May, 17 2017 @ 01:56 PM

originally posted by: stormcell
D-Wave has an 87+ qubit system, used to solve Ramsey numbers. That was back in 2012.

www.technologyreview.com...

They are up to 2,000 qubits now.

www.dwavesys.com...


I was going to post the same thing.


If the OP is right, they are severely lagging behind D-Wave.

In this video they talk about a 1,000-qubit chip back in 2015.

Here is a pic of the new 2,000-qubit D-Wave.


I'll just throw this video in because I find it fascinating, and a little chilling, TBH.
"Geordie Rose - Quantum Computing: Artificial Intelligence Is Here"
You can skip to about 1:25 to avoid intro.


"If you can stand next to one of these machines when it is running it is awe inspiring, it feels like an alter to an alien god"
Geordie Rose

edit on 5 17 2017 by stosh64 because: (no reason given)



posted on May, 17 2017 @ 02:39 PM
It was pointed out to me that the IBM machine is a 'universal' quantum computer, thus a different machine than the D-Wave.

Here is an article I found that explains the difference, if anyone else was unaware.
Google moves closer to universal quantum computer



posted on May, 17 2017 @ 03:14 PM
a reply to: toysforadults

Qubit measuring contest.

IBM is shooting for "quantum supremacy", which is the point where even a meager 50-qubit quantum computer outperforms today's supercomputers.

The Ars Technica article says "quantum volume" is what should be measured, not just bare qubit numbers. Which is what MysterX's quip is about: just cut to the chase.

I think that quantum volume is a valid measurement. Right now there is no standard.

 


a reply to: stormcell

The D-Wave is:

1 - Not a quantum computer.
2 - Not a universal quantum computer.


Quantum annealing (QA) is a metaheuristic for finding the global minimum of a given objective ...

Wikipedia


It has 2,000 qubits, yes. But it is set up to do "approximate" quantum modeling. IBM is the real deal.
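
To give a feel for what "annealing toward a minimum" means, here is a classical simulated-annealing toy in Python. It is only a loose analogue (D-Wave's process is quantum-mechanical and done in hardware), and the objective function and cooling schedule are made up for illustration:

import math
import random

def objective(x):
    # A bumpy 1-D landscape with many local dips and one global minimum.
    return (x - 2.0) ** 2 + 3.0 * math.sin(5.0 * x)

def anneal(steps=20000, start_temp=5.0, seed=0):
    rng = random.Random(seed)
    x = rng.uniform(-10.0, 10.0)
    best_x, best_e = x, objective(x)
    for i in range(steps):
        temp = start_temp * (1.0 - i / steps) + 1e-9   # cool down slowly
        candidate = x + rng.gauss(0.0, 0.5)
        delta = objective(candidate) - objective(x)
        # Always accept improvements; sometimes accept uphill moves while hot.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = candidate
            if objective(x) < best_e:
                best_x, best_e = x, objective(x)
    return best_x, best_e

print(anneal())   # prints (x, energy) somewhere near the global minimum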

I think they opened it up so computer scientists can put quantum algorithms through their paces.



posted on May, 17 2017 @ 03:48 PM
a reply to: Jiggly

No. But it can do a bad @ss drunken walk!!


The algorithm is as follows:

1. Pick a random point on a filled grid and mark it empty.
2. Choose a random cardinal direction (N, E, S, W).
3. Move in that direction, and mark it empty unless it already was.
4. Repeat steps 2-3, until you have emptied as many grids as desired.

The Drunkard Walk guarantees connectivity from the first grid picked, and you can also guarantee that a set percentage of the grid has been carved out. Unless the grid is large, you should bias the direction chosen towards the centre of the grid as the drunkard walk may end up butting up against the edges unnaturally. You may also choose to bias the drunkard walk to choose the last direction it travelled to create longer corridors.

pcg.wikidot.com - Drunkard Walk

Seems harmless enough. But as the choices scale up, a common computer bogs down. A quantum computer can explore multiple directions at the same time.
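
For fun, here is a minimal classical version of that recipe in Python; the grid size, carve fraction, and function name are just my own choices for illustration:

import random

def drunkard_walk(width, height, target_fraction=0.4, seed=None):
    """Carve empty cells out of a filled grid with a random walk."""
    rng = random.Random(seed)
    filled = [[True] * width for _ in range(height)]
    x, y = rng.randrange(width), rng.randrange(height)
    filled[y][x] = False
    carved, goal = 1, int(width * height * target_fraction)
    moves = [(0, -1), (1, 0), (0, 1), (-1, 0)]     # N, E, S, W
    while carved < goal:
        dx, dy = rng.choice(moves)
        nx, ny = x + dx, y + dy
        if 0 <= nx < width and 0 <= ny < height:   # stay on the grid
            x, y = nx, ny
            if filled[y][x]:
                filled[y][x] = False
                carved += 1
    return filled

# Carve roughly 40% of a 20x10 grid and print it ("." = carved, "#" = filled).
grid = drunkard_walk(20, 10, seed=1)
for row in grid:
    print("".join("#" if cell else "." for cell in row))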

These "games" are "algebraic computations" which uses discrete math to notate. We are talking things like algorithms that take N! (N factorial) steps versus on that is n(log n) steps. Games, and counting are very good at measuring the given algorithm.

Since quantum computers are new, the researchers would like to know how this will all work out. My bet is, a clever monkey is going to figure out a method to bridge the qubits together, effectively doubling them.



posted on May, 17 2017 @ 04:19 PM
GAH!

I made this thread: Quantum Zeno Effect, Counterfactuality, and The Weeping Angels.

In the QM world, you can effectively "freeze" a state by repeatedly measuring it. That is the Quantum Zeno effect.
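
A quick back-of-the-envelope illustration of the freezing: take a two-level state that would otherwise rotate away by an angle theta, and interrupt it with N evenly spaced projective measurements. The angle and the N values below are just an example:

import math

# Quantum Zeno effect, numerically: the survival probability is
# (cos^2(theta/N))^N, which approaches 1 as N grows.
theta = math.pi / 2            # without measuring, the state fully flips
for n in (1, 2, 10, 100, 1000):
    survival = math.cos(theta / n) ** (2 * n)
    print(f"{n:>4} measurements: survival = {survival:.4f}")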


The quantum computer is set in a superposition of "not running" and "running" states by means such as the Quantum Zeno Effect. Those state histories are quantum interfered. After many repetitions of very rapid projective measurements, the "not running" state evolves to a final value imprinted into the properties of the quantum computer. Measuring that value allows for learning the result of some types of computations such as Grover's algorithm even though the result was derived from the non-running state of the quantum computer.

Wikipedia - Counterfactual Quantum Computation.

You can run quantum algorithms on a non-running quantum computer!

Besides being counterfactual, it is counter-reasonable!! Schrodinger's cat makes an appearance. As does the universe next door...

Wikipedia - Schrödinger's cat.

Are you a Copenhagen-type or a many-worlds person?

Me, I just cannot seem to escape the Quantum Zeno effect! Prolly one of those dang angels touched me...
edit on 17-5-2017 by TEOTWAWKIAIFF because: sticky keys



posted on May, 17 2017 @ 05:01 PM

Each classical bit has a definite value; it can only be 0 or 1; it can be copied without changing its value; it can be read without changing its value; and, when left alone, its value will not change significantly. Reading one classical bit does not affect other (unread) bits. You must run the computer to compute the result of a computation. Every one of those statements is false for qubits, even that last statement! There is a further difference. For a classical computer, the process is: Load → Run → Read, whereas for a quantum computer, the steps are: Prepare → Evolve → Measure.

sciencedirect.com - The quantum in your materials world.

That is a pretty good summary! I like the compare and contrast aspect of classical computing vs. quantum computing.

2^16 = 65,536 (the old memory term is "64 K"). If you add another qubit, you have 2^17 = 131,072.

2^n is the number of states your qubits can represent, where n is the number of qubits. Supremacy is estimated at around 50 qubits, and 2^50 is a very large number!
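
To put numbers on that growth (the storage figures assume 16 bytes per complex amplitude, purely for illustration):

# An n-qubit register takes 2**n complex amplitudes to describe in full.
for n in (5, 16, 17, 50):
    states = 2 ** n
    gb = states * 16 / 1e9     # assuming complex128, 16 bytes per amplitude
    print(f"{n:>2} qubits -> {states:,} amplitudes (~{gb:.3g} GB to store)")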



posted on May, 17 2017 @ 06:13 PM

Quantum Experience has so far attracted about 40,000 users from more than 100 countries. Chuang, for example, used it in an online, graduate-level class on quantum computing that he taught late last year, so that students could practise programming an actual quantum computer.

The system’s users have performed 275,000 experiments and produced about 15 research papers.

Scientific American - IBM Will Unleash Commercial "Universal" Quantum Computers This Year.


"We don’t just want to build these machines,” Jerry Chow, the manager of IBM’s Experimental Quantum Computing team told Wired. “We want to build a framework that allows people to use them.”

How easy it will be to translate the skills learned in one of these companies’ proprietary quantum computing ecosystems to another also remains to be seen, not least because the technology at the heart of them can be dramatically different. This could be a further stumbling block to developing a solid pool of quantum programmers.

Ultimately, the kinds of large-scale quantum computers powerful enough to be usefully put to work on real-world problems are still some years away, so there’s no need to panic yet. But as the researchers behind Google’s quantum effort note in an article in Nature, this scarcity of programming talent also presents an opportunity for those who move quickly.

singularityhub.com, May 9, 2017 - Quantum Computing Demands a Whole New Kind of Programmer.

I really should sign up for IBM Q access. I would also have to learn LaTeX; I should probably learn it anyway. I can't find my Discrete Math textbook. Bummer! I have mostly forgotten computational algebra. I know there was some in my Discrete Math book (I was looking for "n-choose" symbol usage and examples). *sigh*
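
For the record, the "n-choose-k" symbol in LaTeX is just \binom from amsmath. A minimal snippet:

\documentclass{article}
\usepackage{amsmath}
\begin{document}
The number of ways to choose $k$ items from $n$:
\[
  \binom{n}{k} = \frac{n!}{k!\,(n-k)!},
  \qquad \binom{5}{2} = 10 .
\]
\end{document}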

But hey, it is nice to see that a lot of people have signed up and are running algorithms! Even if it was only on 5 qubits.



posted on May, 17 2017 @ 06:32 PM
a reply to: TEOT

Hey, TEOT! It might not be that bad!


QCL (Quantum Computation Language) is one of the first implemented quantum programming languages.

Syntax

Data types:
Quantum - qureg, quvoid, quconst, quscratch, qucond
Classical - int, real, complex, boolean, string, vector, matrix, tensor

Function types:
qufunct - Pseudo-classic operators. Can only change the permutation of basic states.
operator - General unitary operators. Can change the amplitude.
procedure - Can call measure, print, and dump inside this function. This function is non-invertible.

Built-in functions (quantum):
qufunct - Fanout, Swap, Perm2, Perm4, Perm8, Not, CNot
operator - Matrix2x2, Matrix4x4, Matrix8x8, Rot, Mix, H, CPhase, SqrtNot, X, Y, Z, S, T
procedure - measure, dump, reset

Built-in functions (classical):
Arithmetic - sin, cos, tan, log, sqrt, ...
Complex - Re, Im, conj

-and-


QML is a Haskell-like quantum programming language by Altenkirch and Grattage

Wikipedia: Quantum Programming
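
Just to see what those built-ins boil down to, here is the usual two-qubit "hello world" (make a Bell pair with H then a controlled-NOT) written out as plain linear algebra in Python/NumPy rather than QCL. Purely illustrative:

import numpy as np

# Build a Bell pair: start in |00>, Hadamard the first qubit, then CNOT.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],     # control = first qubit, target = second
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                     # |00>
state = np.kron(H, I) @ state      # H on the first qubit
state = CNOT @ state               # entangle
print(state)                       # [0.707 0 0 0.707]: measure 00 or 11, 50/50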

Haskell?!... "This is a UNIX system. I know this!"

And people said, "Hey TEOT! If you sit around playing with Haskell you'll end up talking to yourself!" Who's laughing now? I'm half-way there on QML. Another 3-letter acronym for your CV!!




edit on 17-5-2017 by TEOTWAWKIAIFF because: pronouns are a pain but very meaningful



posted on May, 17 2017 @ 06:34 PM
link   

originally posted by: rickymouse
Does anyone other than me feel we are going in the wrong direction? Shouldn't we be learning how to evaluate things ourselves instead of typing them into a computer to get the answer? You know, we have to ask the computer the right question, and that takes intelligence too. Will this computer be able to roll its eyes at you to inform you that you are looking in the wrong direction, or will it allow you to follow the wrong path?


A computer is just a way to evaluate complex questions. It's not magic, even though many treat it as such.



posted on May, 17 2017 @ 06:44 PM

originally posted by: Aazadan

originally posted by: rickymouse
Does anyone other than me feel we are going in the wrong direction? Shouldn't we be learning how to evaluate things ourselves instead of typing them into a computer to get the answer? You know, we have to ask the computer the right question, and that takes intelligence too. Will this computer be able to roll its eyes at you to inform you that you are looking in the wrong direction, or will it allow you to follow the wrong path?


A computer is just a way to evaluate complex questions. It's not magic, even though many treat it as such.


Too many people treat it like magic. When disputing a bill, the people you talk to sometimes can't make the computer do what needs to be done. So the manager gets called in, and they just shake their head and tell the workers to change the information on the screen. The workers can't do it the way they were taught; meanwhile, the customers get frustrated and leave. The programs are designed to do that; the programmer often builds in some stuff that makes the discounts not work right.



posted on May, 17 2017 @ 07:25 PM

originally posted by: rickymouse
Too many people treat it like magic. When disputing a bill, the people you talk to sometimes can't make the computer do what needs to be done. So the manager gets called in, and they just shake their head and tell the workers to change the information on the screen. The workers can't do it the way they were taught; meanwhile, the customers get frustrated and leave. The programs are designed to do that; the programmer often builds in some stuff that makes the discounts not work right.


It's my opinion that cash registers + cashiers are an example of computers done poorly. There is little advantage to them, other than counting the correct change... a basic math skill that you should have for other reasons. There are many reasons to use a computer, but using one as a simple calculator is not one of them.



posted on May, 18 2017 @ 04:57 AM
a reply to: Aazadan

Those networked cash registers are also keeping track of the items in the store, so they know what to order and when. Before, you needed to check every article in the store and in the back room and then figure out how much to order. Those systems do help a great deal.
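
Roughly the kind of check those systems automate (item names and thresholds below are made up):

# Toy reorder-point check of the sort a networked register system runs
# automatically as each sale decrements the count.
stock = {"milk": 12, "bread": 3, "eggs": 40}
reorder_point = {"milk": 10, "bread": 8, "eggs": 24}

for item, count in stock.items():
    if count <= reorder_point[item]:
        print(f"Reorder {item}: only {count} left")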



