Welcome!

Nanoscale transistors fill warehouse-scale supercomputers, yet their performance still constrains the development of the jets that defend us, the medical therapies our lives depend upon, and the renewable energy sources that will power our generation into the next. AI-Sci, the Scientific AI Collaborative, develops computational models and numerical methods to push these applications forward. We pair our methods with algorithms crafted to make efficient use of the latest exascale machines and computer architectures, including AMD GPUs, Arm and RISC-V CPUs, and quantum computers. We develop open-source software for these methods that scales to the world’s largest supercomputers. Check out the rest of this website to learn more.

PI: Spencer Bryngelson
Assistant Professor
College of Computing, CSE
College of Engineering, AE/ME
Georgia Tech

Openings? Visit this page if you’re interested in joining our group.

Examples of our work

Bubble cavitation and droplet shedding are fundamental multiphase flow problems at the core of naval hydrodynamics, aerospace propulsion, and more. We developed a sub-grid method for simulating these phenomena. MFC, our open-source exascale-capable multi-phase flow solver, demonstrates such scale-resolving simulation of a shock-droplet interaction in the above video (via Ph.D. student Ben Wilfong).

The spectral boundary integral method enables high-fidelity prediction and analysis of blood cells transitioning to chaos in a microfluidic device. It resolves strong cell-membrane deformation with scant computational resources (above). We developed a stochastic model for the cell-scale flow, enabling microfluidic device design and improving treatment outcomes.

News

18 July, 2024

Learning path to AGI, PINNs, QINNs, and Quantum Computing
Developing an application that integrates Artificial General Intelligence (AGI), Physics-Informed Neural Networks (PINNs), Quantum Computing, and Quantum-Informed Neural Networks (QINNs) is an ambitious and interdisciplinary endeavor. Here’s a comprehensive guide to the prerequisites, technologies, and learning paths:

Prerequisites

Mathematics

  • Linear Algebra
  • Calculus (including differential equations)
  • Probability and Statistics
  • Optimization
  • Complex numbers and quantum mechanics fundamentals (for Quantum Computing and QINNs)

Computer Science

  • Algorithms and Data Structures
  • Programming Languages (Python, C++, potentially quantum-specific like Q#)
  • Software Engineering Principles

Physics (for PINNs and a deeper understanding)

  • Classical Mechanics
  • Quantum Mechanics
  • Statistical Mechanics
  • Computational Physics

Machine Learning and AI

  • Supervised and Unsupervised Learning
  • Reinforcement Learning
  • Neural Networks and Deep Learning

Learning Paths

Artificial General Intelligence (AGI)

Foundational Knowledge:

  • Mathematics and Statistics
  • Computer Science Basics
  • Cognitive Science and Neuroscience Fundamentals

Core AGI Topics:

  • Advanced Machine Learning
  • Deep Learning Architectures
  • Reinforcement Learning
  • Meta-Learning and Transfer Learning

Resources:

  • "Artificial Intelligence: A Modern Approach" by Stuart Russell and Peter Norvig
  • "Deep Learning" by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
  • Online courses from Coursera, edX, and MIT OpenCourseWare

Physics-Informed Neural Networks (PINNs)

Foundational Knowledge:

  • Differential Equations
  • Numerical Methods
  • Basics of Machine Learning

Core PINN Topics:

  • Formulating Physical Laws as Neural Network Constraints
  • Solving Differential Equations using PINNs
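
As a miniature illustration of the core idea (a sketch of my own, not from any particular PINN library): a PINN penalizes the residual of a differential equation at collocation points. Here the neural network is replaced by a one-parameter ansatz u(t) = exp(a·t) for the ODE u′ = −u with u(0) = 1, and the hand-derived gradient stands in for automatic differentiation:

```python
import numpy as np

# PINN idea in miniature: fit u(t) = exp(a*t) to the ODE u'(t) = -u(t),
# u(0) = 1, by minimizing the squared ODE residual at collocation points.
# A real PINN swaps the one-parameter ansatz for a neural network and
# computes u'(t) with automatic differentiation.

t = np.linspace(0.0, 1.0, 50)   # collocation points
a = 0.5                          # trainable parameter (exact answer: a = -1)
lr = 0.1

for _ in range(500):
    u = np.exp(a * t)
    residual = a * u + u                  # u'(t) + u(t); zero when the ODE holds
    du_da = t * u                         # d(u)/da, derived by hand
    dres_da = u + (a + 1.0) * du_da       # d(residual)/da
    grad = np.mean(2.0 * residual * dres_da)
    a -= lr * grad

print(round(a, 3))  # converges close to -1.0
```

Frameworks such as DeepXDE (mentioned below under tools) package exactly this residual-loss construction for general PDEs.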

Resources:

  • Papers and tutorials by George Em Karniadakis and other leading researchers
  • Online courses on PINNs and Scientific Machine Learning

Quantum Computing

Foundational Knowledge:

  • Quantum Mechanics
  • Linear Algebra and Complex Numbers
  • Basics of Quantum Computing

Core Quantum Computing Topics:

  • Quantum Gates and Circuits
  • Quantum Algorithms (e.g., Grover's and Shor's algorithms)
  • Quantum Error Correction
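
Gates and circuits are, underneath, just linear algebra on a state vector. A minimal sketch (my own, library-free): apply a Hadamard then a CNOT to |00⟩, producing the Bell state (|00⟩ + |11⟩)/√2. Libraries like Qiskit and Cirq wrap this same arithmetic behind a circuit API:

```python
import numpy as np

# Two-qubit statevector simulation: H on qubit 0, then CNOT (control = qubit 0).
# Convention: qubit 0 is the most-significant bit of the basis-state index.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # flips qubit 1 when qubit 0 is 1

state = np.array([1, 0, 0, 0], dtype=complex)  # |00>
state = np.kron(H, I) @ state                  # superpose qubit 0
state = CNOT @ state                           # entangle the pair

probs = np.abs(state) ** 2
print(probs)  # only |00> and |11> have nonzero probability, 0.5 each
```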

Resources:

  • "Quantum Computation and Quantum Information" by Michael Nielsen and Isaac Chuang
  • Online courses from IBM Quantum Experience, Coursera, and edX

Quantum-Informed Neural Networks (QINNs)

Foundational Knowledge:

  • Basics of Quantum Computing and Neural Networks

Core QINN Topics:

  • Quantum Neural Network Architectures
  • Hybrid Classical-Quantum Models
  • Variational Quantum Algorithms
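
The variational idea can be shown in a few lines (a self-contained sketch, not tied to any platform): a single-qubit circuit RY(θ)|0⟩ has cost ⟨Z⟩ = cos θ, and the parameter-shift rule recovers the exact gradient from two extra circuit evaluations, which is how gradients are obtained on hardware where backpropagation is impossible:

```python
import numpy as np

def expectation_z(theta):
    # RY(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>, so <Z> = cos(theta)
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    z = np.array([[1, 0], [0, -1]])
    return state @ z @ state

def shift_grad(theta, s=np.pi / 2):
    # Parameter-shift rule: exact gradient for Pauli-rotation gates,
    # computed from two shifted circuit evaluations
    return 0.5 * (expectation_z(theta + s) - expectation_z(theta - s))

theta, lr = 0.4, 0.4
for _ in range(200):            # gradient descent on the circuit parameter
    theta -= lr * shift_grad(theta)

print(round(expectation_z(theta), 3))  # minimum <Z> = -1, reached at theta = pi
```

Hybrid classical-quantum frameworks such as PennyLane automate this loop, with the expectation evaluated on a simulator or a real device.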

Resources:

  • Research papers on QINNs and Variational Quantum Algorithms
  • Tutorials and documentation from quantum computing platforms like IBM Qiskit and Google's TensorFlow Quantum

Technologies and Tools

Programming Languages

  • Python (widely used in AI/ML, PINNs, and quantum computing)
  • Q# (for quantum computing - Microsoft)
  • Julia (for numerical and scientific computing)

Frameworks and Libraries

  • TensorFlow/PyTorch: For deep learning and neural networks.
  • DeepXDE: For PINNs.
  • Qiskit: For quantum computing (IBM).
  • Cirq: For quantum computing (Google).
  • PennyLane: For quantum machine learning.

Computational Platforms

  • Google Colab: For quick prototyping and experiments.
  • IBM Quantum Experience: For running quantum algorithms on real quantum hardware.
  • AWS Braket: For accessing quantum computing resources.

Integrative Learning Path

  • Undergraduate and Graduate Studies:
    • Major in Physics, Computer Science, or related fields.
    • Take courses in quantum mechanics, machine learning, and computational physics.
  • Self-Directed Learning:
    • Online courses, MOOCs, and workshops on AGI, PINNs, and quantum computing.
    • Follow cutting-edge research and stay updated with the latest developments.
  • Practical Experience:
    • Work on interdisciplinary projects combining these fields.
    • Contribute to open-source projects and collaborate with research communities.
    • Participate in competitions and hackathons focused on AI and quantum computing.
  • Networking and Collaboration:
    • Join professional organizations and attend relevant conferences.
    • Collaborate with researchers and industry professionals in these fields.

By following this comprehensive learning path, you can acquire the skills and knowledge needed to develop advanced applications integrating AGI, PINNs, quantum computing, and QINNs.


12 May, 2024 Our paper, Method for portable, scalable, and performant GPU-accelerated simulation of multiphase compressible flow, was accepted to Computer Physics Communications. Congrats to Anand, Henry, and Ben!

9 May, 2024 Congratulations to Jesus and Ben, who passed their Ph.D. qualifying exams!

6 May, 2024 Subrahmanyam, resident sickle cell dynamics expert, graduates with his BSCS and heads to a snazzy industry gig. Congratulations, Sub!

4 May, 2024 Max starts his NVIDIA summer internship. Congrats, Max!

8 April, 2024 Congratulations to Suzan on winning the PURA Salary Award for research this summer, and to Anshuman on our latest publication: Neural networks can be FLOP-efficient integrators of 1D oscillatory integrands.

4 March, 2024 We are at the 2024 APS March Meeting! We have talks on simulating fluid flow on quantum devices as well as exascale machines like Frontier.

23 February, 2024 Dr. Tianyi Chu joins the group as a postdoc. Welcome, Tianyi!

19 February, 2024 Spencer gains a courtesy appointment in Georgia Tech’s Woodruff School of Mechanical Engineering.

13 February, 2024 MFC has been accepted to the second round of the Oak Ridge Frontier Hackathon! MFC scales to 100% of the world’s largest computer, but extracting maximum performance still needs attention. We look forward to working on it!

9 February, 2024 Spencer gives an invited talk at the 2024 CRNCH Summit on CFD on an existing IBM quantum device. Great event!

… see all News