IT Matters is a speaker series hosted by Kevin Boyd, associate vice president and CIO. During these sessions, Kevin welcomes guests to the IT Matters stage to discuss the important work of the University and the key role information technology plays in enabling and advancing the efforts of faculty, students, staff, and community partners.
Please join us online for the next session of the IT Matters speaker series:
Date: Monday, October 25, 2021
Time: Noon to 2 p.m.
Bring your lunch and log on to Zoom for this engaging discussion. You must register using a valid UChicago email address. Please submit questions in advance to firstname.lastname@example.org.
About the Speakers
Marynia Kolak is Associate Director for Health Informatics in the Center for Spatial Data Science. She is a health geographer using open science tools and an exploratory data analytic approach to investigate issues of equity across space and time. Her research centers on how “place” impacts health outcomes in different ways, for different people, from opioid risk environments to chronic disease clusters. She focuses on quantifying and distilling the structural determinants of health across different environments, tying political ecology models of public health with geocomputational methods and quasi-experimental policy evaluation techniques. In addition to her role at the University of Chicago, Marynia serves as a board member of the AAG Health and Medical Geography Specialty Group and chair of the Chicago Public Health GIS Network.
Sanjay Krishnan is Assistant Professor in the Department of Computer Science. His research focuses on applications of machine learning and control theory to computer and cyber-physical systems problems. His group explores the intersection of AI and data science, working toward a world where intelligent systems can automatically perform many of the data analytics tasks we currently expect humans to do. By designing “intelligent learning systems” that automatically adapt and optimize to new data and tasks, Krishnan hopes to create more flexible and powerful systems that enable visions of a more autonomous world.
Julian Solway is the Walter L. Palmer Distinguished Service Professor of Medicine and Pediatrics; Director, Institute for Translational Medicine; Dean for Translational Medicine for the Biological Sciences Division; Vice Chair, Research for Department of Medicine; and Chair, Committee on Molecular Medicine. He is an expert in pulmonary medicine with a particular interest in the management of severe and persistent asthma. Under his leadership, researchers, scientists, and clinicians work to understand the causes of disease and to bring new therapies to the public.
Dr. Solway’s research focuses on the structure and function of smooth muscle, the tissue that encircles the bronchial tubes. When irritated, smooth muscle contracts, narrowing these air passages and making it hard to breathe. By studying this process, Dr. Solway hopes to find more effective treatments for asthma. His work has been extensively funded by the National Institutes of Health (NIH).
Samuel Volchenboum is Associate Professor of Pediatrics; Dean of Master’s Education; Associate Chief, Research Informatics Officer; and Associate Director, Institute for Translational Medicine. He is an expert in pediatric cancers and blood disorders. He has a special interest in treating children with neuroblastoma, a tumor of the sympathetic nervous system. In addition to caring for patients, Dr. Volchenboum studies ways to harness computers to enable research and foster innovation using large data sets. He directs the development of the International Neuroblastoma Risk Group Database project, which connects international patient data with external information such as genomic data and tissue availability. The Center he runs provides computational support for the Biological Sciences Division at the University of Chicago, including high-performance computing, applications development, bioinformatics, and access to the clinical research data warehouse.
Friday, January 29 | 10:00 am
Quantum physics is opening the door to quantum computers and networks that may one day greatly outperform current computers. During this event, we talked with four researchers, Diana Franklin, Liang Jiang, Robert Rand, and David Schuster, about their work in quantum computing and how it could change the work we do in IT.
The recording of the event is available online.
About the Speakers
Associate Professor in Computer Science
Diana Franklin leads five projects in computer science education, involving students ranging from 3rd grade through university. She is the lead Principal Investigator (PI) for quantum computing education for EPiQC, a National Science Foundation (NSF) expedition in computing. Her research agenda explores ways to create curriculum and computing environments that reach a broad audience. Her research interests include computing education research, architecture involving novel technologies, and ethnic and gender diversity in computing. She is the author of “A Practical Guide to Gender Diversity for CS Faculty,” from Morgan & Claypool.
Professor of Molecular Engineering in the Pritzker School of Molecular Engineering
Liang Jiang investigates quantum systems and explores various quantum applications, such as quantum sensing, quantum transduction, quantum communication, and quantum computation. His efforts are helping to make quantum computing and communication technology scalable and more accessible. Professor Jiang is Principal Investigator (PI) for the Jiang Group, which investigates quantum control and quantum error correction to protect quantum information from decoherence across various physical platforms. He has worked on modular quantum computation, global-scale quantum networks, room-temperature nano-magnetometers, sub-wavelength imaging, micro-optical quantum transduction, and error-correction-assisted quantum sensing and simulation.
Assistant Professor in Computer Science
Robert Rand is part of the small but growing community of researchers creating quantum programming languages. Today’s programmers of classical computers have a vast library of languages they can use, from high-level languages such as Python and Java to more targeted languages for specific tasks such as working with databases or spreadsheets. But quantum computing languages sit closer to the early, pre-Fortran days of computer science, where many options exist but no consensus has emerged. Professor Rand authored an online, interactive textbook, “Verified Quantum Computing,” which takes a mathematical approach to its topic, teaching concepts by asking students to prove theorems, which can then be verified by an automated proof assistant.
Associate Professor in the Department of Physics at the James Franck Institute, and the College
David Schuster is the Principal Investigator (PI) for the Schuster Lab, which specializes in quantum information, with research efforts in quantum computing, hybrid quantum systems, and quantum simulation. His research focuses on understanding and controlling the unique properties such as superposition and entanglement of quantum systems in a variety of platforms.
Q and A with the Panelists
What are the current theories around overcoming the decoherence (measurement) problem? Are there theories that posit decoherence as a step in the algorithm of quantum computation?
Diana Franklin: Decoherence is a bit separate from measurement. The belief is that measurement will *always* destroy the state. Decoherence is when the state changes even in the absence of measurement. Decoherence is largely an engineering problem: researchers are trying to better fabricate and protect devices so they have less decoherence. On the other hand, some are working on algorithms that know about the types of errors that occur in the hardware.
Liang Jiang: Sometimes we may also engineer decoherence to assist quantum information processing or error correction, which is an active research direction.
Can you explain more about how probability fits into the field of quantum physics?
Diana Franklin: When you have a superposition, that is a number of simultaneous states at once (in a single qubit, just two states, but in n qubits, it’s 2^n states). However, you can only ever read out one of those 2^n states, and the one you measure is a probabilistic outcome. So there is a certain probability of reading out each state, and quantum operations modify those probabilities.
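Franklin's description can be illustrated with a small simulation. This is a hedged sketch using numpy rather than any real quantum SDK: an n-qubit state is represented as a vector of 2^n complex amplitudes, the probability of reading out basis state k is the squared magnitude of amplitude k, and quantum operations (unitary matrices) reshape those probabilities.

```python
import numpy as np

def measure(state, shots=10000, rng=np.random.default_rng(0)):
    """Sample measurement outcomes from an n-qubit state vector.

    `state` holds 2**n complex amplitudes; the probability of
    reading out basis state k is |state[k]|**2. Each shot reads
    out exactly one of the 2**n states, as Franklin describes.
    """
    probs = np.abs(state) ** 2
    outcomes = rng.choice(len(state), size=shots, p=probs)
    return np.bincount(outcomes, minlength=len(state)) / shots

# A single qubit in equal superposition: amplitude 1/sqrt(2) on each
# basis state, so each is read out with probability ~0.5.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
print(measure(plus))  # roughly [0.5, 0.5]

# A Hadamard gate is one quantum operation that modifies these
# probabilities: applied to |0> it creates exactly this superposition.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
zero = np.array([1.0, 0.0])
assert np.allclose(H @ zero, plus)
```

The key point the simulation makes concrete is that the 2^n amplitudes evolve together under an operation, but a single readout yields only one basis state.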
Does Quantum Computing bring the calculation of P vs. NP problems into reality?
David Schuster: It is very unlikely that it is possible (though we don’t even know for sure that P is not NP). But it is thought there is a class of problems (BQP) that are efficiently solvable on a quantum computer but are not in P.
For quantum entanglement, is there a distance limitation? Could it be used for better communication to some of the research probes or satellites launched in our solar system?
Diana Franklin: Theoretically, there is not a distance limitation. However, because entanglement and superposition are at the quantum level, a very small disturbance can modify the state. So it’s not distance that’s the problem, it’s the time and environment over which the distance is traveled.
Robert Rand: It’s probably worth noting that quantum entanglement cannot be used to send information faster than light. This is surprising, since what happens to one qubit happens to the other simultaneously, but the destructive properties of measurement prevent actual communication without using classical communication as well.
Is the architecture of the current quantum computers being built (Google, IBM, etc.) the same, different, or similar?
David Schuster: Google and IBM are pretty similar (though details matter): both use grids of superconducting circuits. But there are also entirely different platforms, such as atoms or electrons, which are architected fairly differently in how their qubits interact.
Is the quantum computer thought of as a support tool, rather than the main computing device?
Diana Franklin: Yes, definitely. It is not anticipated to ever replace general-purpose computers.
Is there a reason why a single qubit has two states as opposed to 4, 8, or multiple states?
Diana Franklin: There is something called a qutrit that can have three states, as opposed to the traditional qubit, which has two.
Robert Rand: This depends on the architecture, but in superconducting machines, it’s for the same reason that bits have two states: binary is convenient for computing.
From a recent video lecture, entanglement was described as a merging of the wave components of interacting particles. Is that a plausible model? How does entanglement enable something useful in computing?
Diana Franklin: Entanglement is actually required to do even the simplest quantum calculations. If you think about getting qubits to cooperate together to form a variable, you need entanglement. For example, let’s imagine that you have 3 bits that can store a single number from 0-7 in a classical computer but, when in superposition, multiple numbers simultaneously with some probability of reading out each number. I’d like to store 000 with 25% probability and 111 with 75% probability. Without entanglement, that means that each individual qubit would have 75% probability of reading a 1. Which means that the number 101 could be read with 0.75 * 0.25 * 0.75 probability. But that’s not what we wanted. If the first measurement reads a 1, we want the other two to also measure a 1. That requires entanglement.
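The arithmetic in Franklin's example can be checked directly. This is a small illustrative sketch using numpy (the state vectors are constructed by hand, not produced by any quantum library): without entanglement the three qubits are independent, so the unwanted outcome 101 appears with probability 0.75 × 0.25 × 0.75 ≈ 0.14, while the entangled state assigns it probability zero.

```python
import numpy as np

# Product (unentangled) state: each qubit independently reads 1
# with probability 0.75, i.e. amplitude sqrt(0.75) on |1>.
q = np.array([np.sqrt(0.25), np.sqrt(0.75)])
product_state = np.kron(np.kron(q, q), q)  # 8 amplitudes for 3 qubits

# Entangled state: 25% chance of |000>, 75% chance of |111>,
# and no probability on any mixed outcome like 101.
entangled_state = np.zeros(8)
entangled_state[0b000] = np.sqrt(0.25)
entangled_state[0b111] = np.sqrt(0.75)

probs_product = np.abs(product_state) ** 2
probs_entangled = np.abs(entangled_state) ** 2

print(probs_product[0b101])    # 0.75 * 0.25 * 0.75 ≈ 0.1406
print(probs_entangled[0b101])  # 0.0: entanglement forbids 101
```

The comparison shows why entanglement is required here: no assignment of independent per-qubit probabilities can put weight only on 000 and 111, because independence always leaks probability onto mixed outcomes.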
Has any thought been given to using fiction/science-fiction to help students understand the “non-intuitive” aspects of quantum physics? If so, is there a concern that the fictional elements will confuse the scientific?
Diana Franklin: Yes, people are thinking of creative ways to do things. Ant-Man had a remarkably accurate (up to a point) description of the Quantum Realm. I just submitted a proposal to NSF to create an app with several games that draw analogies to quantum concepts. There are some videos from Europe about Quantum Kate that relate quantum ideas to everyday concepts. I think if some major media company were to decide to do it, and got together the right team, we could make something really cool. We were trying to write comic books with heroes who have quantum-inspired skills, but it was too heavy of a lift, so we did zines instead.
Would we have any research grants/projects at UChicago that our panelists are working on either individually or collaboratively with others on the call?
Robert Rand: David Schuster and I are different parts of the same large grant (EPiQC). I lead the research on QC learning, and he is building the physical computer, and a bunch of other people are working on bridging the gap between algorithms that assume perfect hardware and incredibly imperfect hardware. It brings a computer science systems approach to bridging that gap through technology-aware compiler optimizations, detecting machine errors and mapping qubits with that knowledge, communication between qubits, specializing the program to particular nuances of hardware, etc.
If a resistor is such a good RNG, since classical computers are full of resistors, why haven't those been employed to give us better RNGs?
David Schuster: I think they have to some extent, but the main reason that it’s not more ubiquitous is power consumption: not the resistor itself, but measuring the very small signal it produces.
How much does it cost to build a basic quantum computer today? Just an estimate, understanding costs can be impacted by both the hardware and software needed to perform a specific task. Who has the largest or most powerful quantum computer in the world? What are the challenges in securing quantum computers against having vulnerabilities exploited?
David Schuster: Not including the R&D, it’s probably ~$1-3M. This is largely due to “low volume.” Right now there are several types of quantum computers, which all have a few tens of quantum bits. Most likely the security challenges will be similar to classical ones in terms of making sure there aren’t mistakes in the libraries, user behavior, etc.
Past IT Matters Events
Innovative Approaches to Remote Teaching and Learning at the University of Chicago
Wednesday, June 17 | 1:00 pm
During this event, Chief Information Officer Kevin Boyd talked with a panel of six instructors, who shared and discussed the tools and techniques they used during the Spring Quarter transition to a remote environment.
About the Speakers
Heidi Coleman is a Senior Lecturer in Theater and Performance Studies, Founder of the Chicago Performance Lab, and has worked professionally as a director and dramaturg. She has initiated a number of programs, including New Work Week, Chicago Performance Lab, and Exquisite Pressure: Pop Up Series, and co-curated the University of Chicago Presidential Fellows in the Arts Program as well as the TAPS Commissioning Project. For the past five years, she has co-directed alternate reality games for the University of Chicago: the parasite, Gaming the Future, and A Labyrinth.
Allyson Nadia Field is the Director of Graduate Studies and an Associate Professor in the Department of Cinema and Media Studies and the College. Her scholarship contributes to evolving areas of study that investigate the functioning of race and representation in interdisciplinary contexts surrounding cinema.
Patrick Jagoda is Professor of English and Cinema & Media Studies, and an affiliate of the Center for the Study of Gender and Sexuality. He specializes in digital media theory, game studies and design, and twentieth and twenty-first century American literature and culture. He is Executive Editor of Critical Inquiry, co-founder of the Game Changer Chicago Design Lab and Transmedia Story Lab, and director of the Weston Game Lab.
James Osborne is Assistant Professor of Anatolian Archaeology in the Department of Near Eastern Languages and Civilizations. He is an archaeologist who works in the eastern Mediterranean and ancient Near East focusing on the Bronze and Iron Ages. Methodologically, he incorporates quantitative methods like GIS, space syntax, and geochemical ceramic analysis with native historical and iconographic sources.
Jennifer Spruill is a Senior Lecturer in the Social Sciences Collegiate Division and Co-Chair of the Power, Identity, Resistance Core sequence.
Tuesday, January 21, 9:00 am to 11:00 am | Reva and David Logan Center for the Arts, Performance Hall
During this event, Kevin Boyd talked with four guests from Urban Labs. Urban Labs comprises five labs working to address challenges across five key dimensions of urban life: crime, education, health, poverty, and energy and environment. The guests shared their perspectives on the importance and role of institutions like the University of Chicago in research, and on how information technology helps Urban Labs with its research.
Martin Barron is the Director of Data and Analysis for the University of Chicago’s Crime and Education Labs. Martin leads a team of 25 research analysts and data scientists in their work on over 50 simultaneous research projects aimed at addressing some of Chicago’s most pressing social problems.
Jackson Reimer is a Project Associate for the Energy and Environment Lab, where he provides support on its projects related to water conservation and its portfolio with the US Environmental Protection Agency.
Leoson Hoay is a Research Analyst supporting the data modeling and analytics needs of the various projects in the Health Lab.
9:00 am to 9:30 am
Network and connect with colleagues
9:30 am to 11:00 am
Driving Research and Discovery
Thursday, May 16, 9:00 am to 11:00 am | Reva and David Logan Center for the Arts, Performance Hall
Kevin Boyd, associate vice president and CIO, welcomed Rob Gardner and Robert Grossman for an engaging conversation on research and high performance computing. The guests shared their perspectives on the importance and role of institutions like UChicago in supporting and driving research and discovery.
Rob Gardner is a Research Professor in the Enrico Fermi Institute at the University of Chicago working on computing and analytics for big science. He is Principal Investigator of the NSF-funded SLATE project, which federates Kubernetes edge clusters in the national cyberinfrastructure. He co-leads the US ATLAS Computing Facility, a collaboration of 10 universities and two U.S. Department of Energy laboratories providing computational and data services to the ATLAS collaboration at the CERN Large Hadron Collider (LHC) in Geneva, Switzerland. He leads the Scalable Systems Laboratory of the newly created NSF Institute for Research and Innovation in Software for High Energy Physics (IRIS-HEP), with a focus on novel data organizational methods, data delivery services, and analysis platforms for the high-luminosity upgrade of the LHC in 2026. His lab also builds advanced cyberinfrastructure supporting forefront scientific collaborations including the XENON, South Pole Telescope, VERITAS, LIGO, and IceCube experiments.
Robert Grossman is a faculty member in the section of Genetic Medicine, as well as the chief research informatics officer for the Division of the Biological Sciences. He is also a senior fellow in the Computation Institute and the Institute for Genomics and Systems Biology. His research group focuses on bioinformatics, data mining, cloud computing, data intensive computing, and related areas. He is also the founder of Open Data Group, which has provided strategic consulting and outsourced services in analytics since 2002, specializing in building predictive models over big data. His current research is focused on bioinformatics, especially developing systems, applications, and algorithms so that large data sets of genomics data can be integrated with phenotype information extracted from electronic medical records and analyzed to deepen our understanding of diseases.
9:00 am to 9:30 am
Network and connect with colleagues
9:30 am to 11:00 am