Context

Regulations on AI systems — EU AI Act, NIS2, Medical Device Regulation — define high-level obligations: transparency, human oversight, risk management, documentation. They do not define which measurable properties a system must exhibit to be considered compliant. This gap leaves operators with the job of translating regulatory requirements into verifiable controls, a step that today is handled ad hoc and rarely in a reproducible way.

OISG (Open, Intelligent, Secure, Governed) proposes an adequacy vocabulary that makes this translation explicit. It is neither a compliance framework nor a consortium: it is a conceptual model, accompanied by measurable criteria, designed to be applied on top of existing standards.

The four pillars

Open — Components that influence system decisions must be inspectable, reproducible and interoperable. Adequacy metric: the fraction of relevant components that can be audited by independent third parties without proprietary access.

Intelligent — System capabilities must be measured, documented, explicitly bounded and explainable. Metric: whether the system can produce a complete explanation of a response (data sources, model version, confidence level) within a defined latency bound.

Secure — The system must be resilient to adversarial manipulation at runtime. Metrics: mean time to detect, contain and forensically recover from a compromised agent.

Governed — Compliance must be verifiable automatically, with immutable evidence. Metric: hours required to produce compliance evidence for a supervisory authority.
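The four pillar metrics above can be captured in a small data structure and checked against thresholds. The following is a minimal Python sketch; the field names and threshold values are illustrative assumptions, not figures from the OISG paper.

```python
from dataclasses import dataclass

# Hypothetical container for the four OISG pillar metrics described above.
# Field names and default thresholds are illustrative only.

@dataclass
class PillarMetrics:
    auditable_fraction: float      # Open: fraction of components auditable by third parties (0..1)
    explanation_latency_s: float   # Intelligent: seconds to produce a complete explanation
    mean_time_to_detect_h: float   # Secure: mean hours to detect a compromised agent
    evidence_hours: float          # Governed: hours to produce compliance evidence

def within_bounds(m: PillarMetrics,
                  min_auditable: float = 0.8,
                  max_latency_s: float = 2.0,
                  max_detect_h: float = 24.0,
                  max_evidence_h: float = 8.0) -> dict:
    """Check each pillar metric against an illustrative threshold."""
    return {
        "Open": m.auditable_fraction >= min_auditable,
        "Intelligent": m.explanation_latency_s <= max_latency_s,
        "Secure": m.mean_time_to_detect_h <= max_detect_h,
        "Governed": m.evidence_hours <= max_evidence_h,
    }

result = within_bounds(PillarMetrics(0.9, 1.5, 12.0, 4.0))
print(result)  # → {'Open': True, 'Intelligent': True, 'Secure': True, 'Governed': True}
```

A real deployment would source these numbers from audit tooling and monitoring; the point here is only that each pillar bottoms out in a measurable quantity.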

The interdependence loop

The four pillars are not independent dimensions: they form a loop. A system that is not Open cannot be independently measured as Intelligent. A system without Intelligence metrics cannot define its own Security perimeter. Without Security there is no stable Governance evidence. Without Governance there are no structural incentives towards openness. Evaluating each pillar is only meaningful if the other three are adequately defined.

Mapping onto existing standards

OISG does not replace regulations and reference frameworks: it indexes them. The 1.0 paper includes an explicit mapping onto EU AI Act, NIST AI RMF, ISO/IEC 42001 and the OWASP Top 10 for Agentic Applications, so that each adequacy criterion can be traced back to one or more regulatory requirements. This reduces duplication across audit processes and allows collected evidence to be reused for multiple frameworks at once.
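The evidence-reuse idea can be sketched as a simple index from adequacy criteria to framework requirements. The criterion IDs and requirement references below are hypothetical examples, not entries from the OISG 1.0 mapping.

```python
# Illustrative index from OISG adequacy criteria to framework requirements.
# Criterion IDs and requirement references are invented for this sketch.

CRITERION_MAP = {
    "open-01":   ["EU AI Act Art. 13", "NIST AI RMF MAP 1.1"],
    "govern-03": ["EU AI Act Art. 12", "ISO/IEC 42001 8.2"],
    "secure-02": ["OWASP Agentic A01", "NIS2 Art. 21"],
}

def frameworks_covered(evidence_for: list[str]) -> set[str]:
    """Given the criteria a piece of evidence satisfies, return every
    framework requirement that evidence can be reused for."""
    covered = set()
    for criterion in evidence_for:
        covered.update(CRITERION_MAP.get(criterion, []))
    return covered

print(sorted(frameworks_covered(["open-01", "govern-03"])))
# → ['EU AI Act Art. 12', 'EU AI Act Art. 13', 'ISO/IEC 42001 8.2', 'NIST AI RMF MAP 1.1']
```

Because the mapping is many-to-many, one audited criterion can discharge requirements in several frameworks at once, which is exactly the deduplication the paper describes.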

Adequacy Test

The oisg.ai site includes an evaluation wizard: twenty criteria (five per pillar) produce an adequacy score and a downloadable certificate. It is a self-diagnosis tool for teams designing or operating autonomous AI systems, and serves as the starting point for a structured audit, not a replacement for one.
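A score over twenty boolean criteria, five per pillar, might be aggregated as follows. The equal weighting and averaging rule are assumptions for illustration; the wizard's actual formula is not specified here.

```python
# Sketch of aggregating an adequacy score from twenty criteria,
# five per pillar. Weighting scheme is assumed, not the wizard's actual rule.

PILLARS = ("Open", "Intelligent", "Secure", "Governed")

def adequacy_score(answers: dict[str, list[bool]]) -> float:
    """Average per-pillar pass rate; expects exactly five answers per pillar."""
    for p in PILLARS:
        if len(answers[p]) != 5:
            raise ValueError(f"{p}: expected 5 criteria, got {len(answers[p])}")
    per_pillar = [sum(answers[p]) / 5 for p in PILLARS]
    return sum(per_pillar) / len(PILLARS)

sample = {
    "Open":        [True, True, True, False, True],
    "Intelligent": [True, True, False, False, True],
    "Secure":      [True, True, True, True, True],
    "Governed":    [True, False, True, True, True],
}
print(f"{adequacy_score(sample):.0%}")  # → 80%
```

Averaging per pillar first, rather than over all twenty criteria at once, keeps a strong pillar from masking a weak one; it also mirrors the interdependence argument, where each pillar must stand on its own.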

Availability

The 1.0 paper is published on 16 April 2026 under CC BY 4.0 and is available at oisg.ai/paper. The first adopter is noze, which has integrated OISG criteria into its AI governance products and services. The paradigm is open: external adoption, peer review and contributions are welcome.