Hi there,

We're on the threshold of some of the biggest upheavals in computing technology since the invention of the microchip. Put simply, Moore's Law as we know it is coming to an end. Intel co-founder Gordon Moore's dictum that the number of transistors on a chip doubles every couple of years is bumping up against the hard limits of physics. For computers to keep getting more powerful, we need entirely new ways to design, build, deploy, and interact with them.

Future Compute, on December 2nd and 3rd, is a new conference from MIT Technology Review where leading experts will map out this new era in computing and what it means for your business.

On day one we'll look at new materials and chip designs to replace the silicon wafer; expect to hear about carbon nanotubes, 3D architectures, and AI and neuromorphic chips. We'll move on to changes in how people use computers, like augmented reality and brain-computer interfaces, and in where computers live: cloud computing, "edge" computing, and the internet of things. And we'll cover the new business opportunities, cybersecurity challenges, workforce issues, and even geopolitical implications as countries and companies race to master these new technologies.

On day two we'll dive deep into the world of quantum computing. Google announced last month that it had achieved "quantum supremacy": a small but crucial step toward machines capable of tasks no conventional computer could ever perform, opening up entirely new vistas in industries like drug discovery, materials science, and aerospace design. We've got people from leading firms like IBM, Google, Microsoft, and D-Wave explaining what quantum computing is (and isn't), how it will be used, and where future business opportunities could lie.

Sign up here and come join us on December 2nd and 3rd. I promise you'll leave with your head spinning, your mind packed full of ideas and possibilities, and a broader, clearer view of one of the most important technological shifts of our time.

Gideon Lichfield
Editor-in-chief
MIT Technology Review