Microsoft announced the release of PyRIT, an open-access red teaming tool designed to help identify risks in generative AI through automation. PyRIT enhances audit efficiency by automating tasks and highlighting areas needing further investigation. It’s aimed at security professionals and ML engineers, controlling AI red team operations and generating additional harmful prompts based on initial sets. PyRIT is available on GitHub for industry peers to adopt for their generative AI applications.

Relevant URL: