Expertise: Operating & Distributed Systems; Programming Languages
Stephanie Wang joined the faculty of the Paul G. Allen School of Computer Science & Engineering at the University of Washington in fall 2024. Her research is primarily in distributed systems. She is especially interested in how we can build intermediate abstractions that make it easier to build high-performance, fault-tolerant, and domain-specific systems. Along those lines, she is interested in domains where large-scale application development remains difficult, such as machine learning and data processing. She is also interested in programming languages as the interface through which distributed systems can be improved.
Previously, Wang was a Ph.D. student in the RISELab at UC Berkeley, where she was advised by Ion Stoica. She is also a co-creator of and committer to the open-source project Ray, which has been used to train ChatGPT, serve high-performance LLMs, and break the CloudSort 100TB record. She subsequently continued to develop the Ray ecosystem as a software engineer at Anyscale, working primarily on Ray Data, a system for distributed data preprocessing for ML.