Even hex breaks back down into a binary format. Looks like binary is the protons and electrons of computing... maybe one day a nirvana method will arise... one form of code that just is, and is all things.
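That hex-to-binary relationship can be checked directly. Here is a minimal Python sketch (my illustration, not part of the original answer): each hex digit corresponds to exactly four binary digits, so the two notations are just different spellings of the same bits.

```python
# Sketch: hex and binary are two notations for the same bit pattern.
value = 0xFF
assert value == 0b11111111 == 255  # same number, three spellings

# A byte round-trips between the two notations losslessly.
print(format(0xA5, '08b'))        # '10100101'
print(format(0b10100101, '02X'))  # 'A5'
```

So hex never leaves binary behind; it only groups the bits four at a time for readability.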
2006-10-03 20:52:41
·
answer #1
·
answered by Matthew 2
·
0⤊
0⤋
Multi-state chips are possible, but so much time and effort has been invested in the flip-flop, and the programming model is so familiar, that as long as it can keep advancing in speed, even through added complexity, it is the likely path forward for some time. There is also quantum computing, though I am not sure whether that still counts as binary.
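To make the multi-state idea concrete, here is a small Python sketch (my illustration, not from the answer above): a hypothetical tri-state cell, holding one of three values instead of a flip-flop's two, stores the same number in fewer digits.

```python
# Sketch: digits needed to store a number in base 2 (flip-flops)
# versus base 3 (a hypothetical tri-state cell).

def digits(n, base):
    """Return the digits of non-negative integer n in the given base,
    most significant first."""
    if n == 0:
        return [0]
    out = []
    while n:
        out.append(n % base)
        n //= base
    return out[::-1]

n = 1000
print(len(digits(n, 2)))  # 10 binary digits: 1111101000
print(len(digits(n, 3)))  # 7 ternary digits: 1101001
```

Each extra state per cell shrinks the digit count, which is exactly the appeal of multi-state storage; the catch, as the answer notes, is that all the established circuitry and tooling assume two states.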
2006-10-03 20:48:29
·
answer #2
·
answered by Anonymous
·
0⤊
0⤋
it is already midnight......
There is nothing more efficient and better than analogue technology.
Digital is easier and longer lasting; by the book, made for "stupids".
Even the crystal-clear sound you get from a CD is no match for a good analogue vinyl disc.
It is said that the harmonics of sound, which actually make the difference in a song and act subconsciously, do not exist in digital. Yes, it might be clear, but still it doesn't have the feeling.
I AGREE it is easier and user friendly, but analogue tech really is the best. It is harder to use and to set up; you need experience and knowledge to do so, and that is why digital flourished so much.
You said it yourself: it is all 1s and 0s. If you are into digitisation, you will never actually compose the perfect waveform to reproduce. OK, it might not make a difference, but it is not the real thing. All these loops of yes and no, all these 0s and 1s, are both a smart and an idiotic way of thinking. Usually in "heavy" applications hybrid technologies are better, where routine tasks are performed by digital electronics. I do not say that they are crap, but they are overestimated, especially compared to analogue electronics.
Now, what we can do to make this step ahead is build innovative applications that push toward new technologies. ;)
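The claim that a digitised signal never reproduces the waveform exactly can be illustrated with quantisation error. This is a rough Python sketch (my illustration, assuming a hypothetical 8-bit converter; the numbers are illustrative only):

```python
import math

# Sketch: quantising a sine wave to 8-bit amplitude levels and
# measuring the rounding error that digitisation introduces.

BITS = 8
LEVELS = 2 ** BITS  # 256 discrete amplitude levels

def quantise(x):
    """Snap x in [-1, 1] to the nearest of LEVELS evenly spaced levels."""
    step = 2.0 / (LEVELS - 1)
    return round((x + 1.0) / step) * step - 1.0

samples = [math.sin(2 * math.pi * t / 1000) for t in range(1000)]
max_err = max(abs(s - quantise(s)) for s in samples)

# The error is bounded by half a quantisation step, but it is never zero.
print(max_err)
```

Adding bits shrinks the error (each extra bit halves the step), which is the usual digital counter-argument; the analogue point above is that the error, however small, is always there.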
2006-10-03 21:21:25
·
answer #3
·
answered by Emmanuel P 3
·
0⤊
0⤋
After those $800 lavatory seats for the B-52 and the $5,000 crash-survivable flashlights for the B-2, everybody got the idea that the government was manned by a gaggle of chumps, so the contractors, Halliburton etc., take advantage of our government on a daily basis and twice on Sunday.
2016-12-26 09:05:41
·
answer #4
·
answered by purinton 3
·
0⤊
0⤋
Quantum computing?
A quantum computer is any device for computation that makes direct use of distinctively quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. In a classical (or conventional) computer, the amount of data is measured by bits; in a quantum computer, it is measured by qubits. The basic principle of quantum computation is that the quantum properties of particles can be used to represent and structure data, and that quantum mechanisms can be devised and built to perform operations with these data.
A quantum computer maintains a set of qubits. A qubit can hold a one, a zero, or a superposition of the two. A quantum computer operates by manipulating those qubits, i.e. by transporting them from memory to (possibly a suite of) quantum logic gates and back.
Though quantum computing is still in its infancy, experiments have been carried out in which quantum computational operations were executed on a very small number of qubits. Research in both theoretical and practical areas continues at a frantic pace, and many national government and military funding agencies support quantum computing research to develop quantum computers for both civilian and national security purposes, such as cryptanalysis.
It is widely believed that if large-scale quantum computers can be built, they will be able to solve certain problems faster than any classical computer. Quantum computers are different from classical computers such as DNA computers and computers based on transistors, even though these may ultimately use some kind of quantum mechanical effect (for example covalent bonds). Some computing architectures such as optical computers may use classical superposition of electromagnetic waves, but without some specifically quantum mechanical resource such as entanglement, they do not share the potential for computational speed-up of quantum computers.
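Here is a minimal sketch of the superposition idea described above, using the standard textbook single-qubit model (my illustration; the amplitude representation and the Hadamard gate are assumptions beyond what the answer states):

```python
import math

# Sketch: a qubit as a pair of complex amplitudes (a0, a1) for the
# basis states |0> and |1>. A Hadamard gate turns |0> into an equal
# superposition of the two.

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (a0, a1)."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

def probabilities(state):
    """Measurement probabilities for |0> and |1> (Born rule)."""
    a0, a1 = state
    return (abs(a0) ** 2, abs(a1) ** 2)

zero = (1 + 0j, 0 + 0j)   # the classical bit 0
plus = hadamard(zero)     # superposition: both outcomes now possible
print(probabilities(plus))  # roughly (0.5, 0.5)
```

Applying the gate again returns the qubit to |0>, which has no classical-bit analogue; that interference between amplitudes is the resource the answer says classical superposition alone (as in optical computers) does not provide.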
2006-10-03 20:47:22
·
answer #5
·
answered by Puzzling 7
·
0⤊
0⤋
hex has made raw binary notation obsolete in some programs, though each hex digit is still just four bits underneath....
2006-10-03 20:48:36
·
answer #6
·
answered by Anonymous
·
0⤊
0⤋