Protegrity has launched AI Team Edition, a product designed to secure AI inferencing through what it describes as a zero model exposure approach.
The launch reflects growing concern over how companies can use sensitive business data in AI systems without handing that information directly to large models. The new product is intended to protect both data and organisational knowledge during inference, an area Protegrity argues is becoming more exposed as businesses adopt foundation models.
At the centre of the product is an approach that protects information before it reaches an AI system. This is intended to reduce the risk of theft, misuse, or distortion of sensitive material while still allowing organisations to run AI workflows and extract value from internal datasets.
The software uses semantic-preserving encryption to protect the meaning and relationships within data as it moves through knowledge graphs and AI workflows. For AI systems, those links between data points can be as valuable as the raw information itself.
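Protegrity has not published the mechanics of the scheme, but a minimal sketch can illustrate the core property it implies: protection that is deterministic, so equal values map to equal tokens and the links between records survive. The key, the `tokenise` helper, and the toy graph below are assumptions for illustration, not Protegrity's method.

```python
import hmac
import hashlib

# Hypothetical illustration only: Protegrity has not disclosed its algorithm.
# The sketch shows one property a semantic-preserving scheme needs:
# deterministic tokenisation, so equal values map to equal tokens and the
# relationships between records survive protection.

SECRET_KEY = b"demo-key-do-not-use-in-production"  # assumed key material

def tokenise(value: str) -> str:
    """Deterministically pseudonymise a value with a keyed hash (HMAC)."""
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return f"tok_{digest[:16]}"

# Two edges in a toy knowledge graph that share the entity "Acme Corp".
edges = [("Acme Corp", "supplies", "Widget Ltd"),
         ("Acme Corp", "owns", "Acme Labs")]

protected = [(tokenise(s), p, tokenise(o)) for s, p, o in edges]

# The raw name is hidden, but both edges still point at the same token,
# so graph traversal and joins keep working on the protected data.
assert protected[0][0] == protected[1][0]
print(protected)
```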
Built on a Kubernetes-based architecture, the product can be deployed across environments with CI/CD integration. Protegrity says this should support rollout, updates, and scaling for organisations running AI projects in different settings.
The launch comes as businesses reassess conventional perimeter-based cyber defences in response to newer AI systems. Protegrity argues that foundation models challenge those older approaches, particularly when they interact with valuable internal knowledge during inference.
Product scope
According to Protegrity, AI Team Edition extends protection beyond traditional data categories to include operational, behavioural, and contextual information used in AI systems. It also allows policies to be created from natural-language inputs and applied automatically, a feature intended to reduce manual work.
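The announcement does not say how natural-language inputs are translated into enforceable policies. As a rough sketch of the idea, the hypothetical `parse_policy` function below extracts a structured rule from an English sentence with simple pattern matching; a production system would presumably use a language model or grammar instead.

```python
import re
from dataclasses import dataclass

# Hypothetical sketch of turning a natural-language policy into a structured,
# machine-enforceable rule. Protegrity has not described its parser; a simple
# pattern match stands in here for whatever model or grammar it actually uses.

@dataclass
class Policy:
    field: str        # data element the rule governs
    action: str       # "mask" or "tokenise"
    roles: list       # roles exempt from the restriction

def parse_policy(text: str) -> Policy:
    """Extract a rough (field, action, roles) triple from an English sentence."""
    field = re.search(r"(salaries|emails|names|\w+ data)", text, re.I)
    action = "mask" if "mask" in text.lower() else "tokenise"
    roles = re.findall(r"except (\w+)", text, re.I)
    return Policy(field.group(1) if field else "unknown", action, roles)

rule = parse_policy("Mask salaries for everyone except HR.")
print(rule)  # Policy(field='salaries', action='mask', roles=['HR'])
```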
Controls can operate across data pipelines, analytics engines, and inference workflows, adapting to context and role. Interactions are validated, logged, and auditable across systems, with the product aimed at use from individual developers to wider departmental and organisational deployments.
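Again as a hedged sketch rather than Protegrity's API: enforcement of this kind can be pictured as a function that checks role and context before releasing a value, recording every decision as it goes. The `enforce` helper and log format below are illustrative assumptions.

```python
import json
import time

# Hypothetical sketch of role-aware enforcement with an audit trail, in the
# spirit of the controls described above. Names and log format are invented.

AUDIT_LOG = []

def enforce(user_role: str, field: str, value: str, policy: dict) -> str:
    """Return the value a given role may see, and log the decision."""
    allowed = user_role in policy.get("exempt_roles", [])
    result = value if allowed else "***MASKED***"
    AUDIT_LOG.append({
        "ts": time.time(), "role": user_role, "field": field,
        "decision": "allow" if allowed else policy["action"],
    })
    return result

policy = {"field": "salary", "action": "mask", "exempt_roles": ["HR"]}
print(enforce("HR", "salary", "84000", policy))       # raw value released
print(enforce("analyst", "salary", "84000", policy))  # masked
print(json.dumps(AUDIT_LOG, indent=2))                # auditable trail
```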
Protegrity is also positioning the offering around cost as well as security. It says the zero model exposure approach can deliver faster time-to-value at a fraction of the cost of typical AI projects, though the announcement did not provide financial details.
Michael Howard, Chief Executive Officer at Protegrity, framed the product around the role of knowledge in AI-driven businesses.
“Knowledge is what forms when facts and transactions meet context. It is when repetitive failure informs judgement, when intent guides interpretation. It is inferential, and lives between systems. Protegrity AI Team Edition is effectively a new type of data and knowledge firewall, a product that has the capabilities required to enable the leading organizations of the world to achieve actual AI success,” said Howard.
Wider pressure
The launch points to a broader shift in how vendors are addressing AI security. Rather than focusing only on keeping attackers outside the network, companies are increasingly trying to manage what happens when models become part of normal business operations and need access to internal information.
That shift has widened responsibility for AI governance beyond specialist security teams. Businesses must decide how data is handled at each stage of an AI workflow, from ingestion and policy setting to inference and audit trails.
Grace Trinidad, Research Director for Future of Trust at IDC, said that wider organisational involvement is becoming necessary.
“Securing AI has become the responsibility of the entire organization, not just security or IT professionals. Protegrity is providing the tools that enable AI security in a way that makes sense no matter where you are or who you are in the organization,” said Trinidad.
AI Team Edition is available now.
