Building and training large language models (LLMs) for scientific discovery poses significant technical challenges, often beyond the resources of any single organization, making multi-institutional collaboration essential to continued progress. Led by Argonne, the international Trillion Parameter Consortium has organized a half-day workshop to accelerate the development and use of generative AI for science and engineering. Speakers include Argonne’s Rick Stevens, Valerie Taylor and Charlie Catlett, along with colleagues from Canada, Japan and Spain.
In another session, researchers Hongwei Jin and Krishnan Raghavan will explore how LLMs can automatically identify unusual activity in computing processes, making systems more reliable and secure. Anshu Dubey will also co-lead an interactive Birds of a Feather session on applying foundational LLM technologies to different high-performance computing targets.