Wed, 15 Apr 2015, 00:58



A former student wrote to me today, asking about Quantum Computing.
I thought I would share my answer with you.  If you have too much
on your plate, feel free to delete this, or save it for when you
have free time.  I have reserved a lecture to talk about some of
the "different" things going on in computer architecture right now,
and quantum computing is one of them, often accompanied by far too
much hype.  So I will talk about it then.  Meanwhile, for those
who are interested in his question, here it is:

> > Hey Dr. Patt,
> > 
> > Hope your semester has been going well.
> > 
> > I'm currently taking Banerjee's course in solid state devices and I love
> > it!  I especially love the (brief) overview of quantum physics and
> > mechanics.  This class is my first exposure to quantum physics and I want
> > to learn more.
>  
> Then I suspect physics is the right place for you!
> 

> > I do plan on going to graduate school for a PhD and focusing my research
> > in the field of analog/RF/THz microelectronics and circuits, but at the
> > same time, the world of quantum computing is also very appealing.
> >
> > So I have two questions:
> > 
> > 1) Coming from a background in analog electronics, how relevant is quantum
> > computing to me?  I'd imagine most of quantum computing relates to the
> > computer engineering track (i.e. comp arch), but at the same time, I do
> > recall that analog circuits are heavily utilized to leverage quantum
> > mechanics.
>  
> First of all, don't listen to all the hype about quantum computing.
> No one knowledgeable, in my view, is advocating quantum computing as a
> replacement for what we are used to.  However, there are some special
> problems that may lend themselves to using quantum computing to solve them.
> That is, most of the peripheral work will be done as usual, but there may be,
> if the quantum computing people can build it, a special purpose piece of
> hardware to handle the execute phase of the quantum computation.  Think of
> it as a very complex instruction called QUANTUM that has the arguments it
> needs as part of the instruction.  In the same way a LD would do its thing
> to the memory system, a QUANTUM would do its thing to the quantum computer.
> The quantum computer would then carry out the work required and respond
> with the result.  Think of it as an accelerator, an attached processor.
> Microsoft is currently experimenting with building one, and more importantly
> from our point of view, they recognize all the headaches in interfacing this
> accelerator to the rest of the computer.  It has to be done, and it will
> require a lot of the stuff you are currently thinking about.  I do not know
> when this will be real, but a lot of smart people are trying to make it
> happen.  It will take talents in a lot of sub-areas of ECE.
> 
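
For readers who want to see the accelerator analogy in concrete terms, here
is a minimal host-side sketch in C.  It is not from the exchange above: the
names (quantum_job, qpu_submit) and the interface are invented purely for
illustration, on the assumption that the quantum unit is driven like any
other attached accelerator: package the arguments, hand them off, and wait
for a classical result.

/* Illustrative only: a made-up host-side view of a hypothetical attached
 * quantum accelerator.  The surrounding program runs on the ordinary CPU;
 * the "QUANTUM instruction" is modeled as one call that packages its
 * arguments, hands them to the accelerator, and waits for a classical
 * result, just as a LD hands its work to the memory system.             */
#include <stdint.h>
#include <stdio.h>

/* The arguments the imaginary QUANTUM instruction carries. */
struct quantum_job {
    uint32_t      program_id;  /* which quantum circuit to run              */
    uint32_t      num_qubits;  /* resources the job needs                   */
    const double *params;      /* classical input parameters, if any        */
    uint64_t     *result;      /* where the accelerator's answer is written */
};

/* Stand-in for the real interfacing work (command queues, data transfer,
 * synchronization, error handling), i.e. the "headaches" mentioned above.
 * Here it just pretends the accelerator ran and produced a value.        */
static int qpu_submit(struct quantum_job *job)
{
    printf("dispatching program %u on %u qubits\n",
           (unsigned)job->program_id, (unsigned)job->num_qubits);
    *job->result = 42;         /* pretend measurement outcome      */
    return 0;                  /* 0 = accelerator reported success */
}

int main(void)
{
    uint64_t outcome = 0;
    struct quantum_job job = {
        .program_id = 7,
        .num_qubits = 8,
        .params     = NULL,    /* this toy job takes no classical inputs */
        .result     = &outcome,
    };

    /* Everything else (setup, I/O, post-processing) stays classical;
     * only this one "instruction" is offloaded.                      */
    if (qpu_submit(&job) == 0)
        printf("quantum result = %llu\n", (unsigned long long)outcome);

    return 0;
}

The design point, under those assumptions, is that the interesting work for
someone with a circuits or architecture background is in the interface
itself: getting arguments to the accelerator and results back.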

> > 2) I see UT's physics department offers a series of courses on quantum
> > mechanics and I want to take them, starting with PHY 373.  Is it worth the
> > time to take pure quantum physics courses, and would it benefit EE
> > research?  Or are the opportunities in quantum/EE already depleted?
> >
> > Thanks,
> > <>
>
> Whether the physics sequence is something for you to do is a question better
> put to one of the Microelectronics faculty than to me.  Speak to Professor
> Banerjee, for example.  I am not sure quantum mechanics will be much direct
> help in what you want to do.  On the other hand, having a deeper understanding
> of as many fundamental things as you can acquire has got to give you a 
> stronger foundation to tackle things you have no idea about yet.
> 
> As to your question about whether opportunities in quantum/EE are already
> depleted, clearly the answer is no.  We do not know if we can make it work,
> so we do not know what opportunities will be available downstream.  I would
> not bet the family farm on it.  But it could be very important for a certain
> class of problems, if we can make it work.  And making it work is what
> creates opportunities. Certainly not depleted.

> Hope this helps.
> Good luck pursuing it.
> Yale Patt