ARIA.

AI Readiness Intelligence Assessment



EU AI Act, PPN 017 and UK GDPR for UK bid teams.

A plain-English regulatory landscape for UK public sector bid teams in 2026. What applies. When. And what to actually do about it.

Executive Summary

UK public sector bid teams now operate inside three regulatory regimes that all touch how AI is used in bid preparation: the EU AI Act, UK Procurement Policy Note 017, and UK GDPR. Each was written for different reasons, applies on different terms, and creates different obligations. None of them treats bid functions as a special case, but bid teams are particularly exposed because they handle confidential tender documents, personal data and time-pressured drafting through tools the wider organisation has rarely governed.

This page sets out, in plain English, what each regime actually requires of UK bid teams in 2026, when the obligations apply, where they intersect, and what the proposed Digital Omnibus amendments may change. It is written for Heads of Bids, Bid Directors, Commercial Directors, AI Governance Leads and Named Reviewers who need a credible reference document, not a legal opinion.

The headline position is straightforward. The EU AI Act has been in force since 1 August 2024 and applies to UK organisations with EU market exposure. PPN 017 has been live since 24 February 2025 and is appearing in UK central government tenders. UK GDPR has applied throughout and is breached every time a CV or named person flows into a public AI tool without a lawful basis. ARIA Diagnostic is calibrated against all three, and ARIA Compass is the training programme that supports your team’s literacy and capability response.

The 2026 regulatory snapshot for UK bid teams

EU AI Act in force: 1 August 2024
Article 4 AI literacy applies: 2 February 2025
PPN 017 in effect: 24 February 2025
GPAI obligations apply: 2 August 2025
EU AI Act full applicability: 2 August 2026 (proposed postponement under Digital Omnibus)
UK GDPR: in force throughout

The EU AI Act for UK bid teams

The EU Artificial Intelligence Act (Regulation 2024/1689) is the world’s first comprehensive horizontal AI law. It entered into force across the EU on 1 August 2024, with obligations phased in over the following four years. The Act is risk-based: it categorises AI systems by the risk they pose and applies obligations proportionate to that risk.

UK organisations are not automatically out of scope. The Act applies extraterritorially to providers placing AI systems on the EU market and to deployers whose use of AI affects people in the EU. UK public sector suppliers bidding for work that touches EU buyers, EU end-users, or EU-based subcontracting chains can fall within scope. The right approach is to assume scope unless a specific assessment confirms otherwise.

Article 4: AI literacy

Article 4 has applied since 2 February 2025. It requires that staff of providers and deployers using AI tools have a level of AI literacy proportionate to their role, the context of use, and the people affected. The text of Article 4 is short. The implications are not. For bid teams, this means every person using AI tools professionally in bid preparation needs literacy that recognises the legal environment, the data they are handling, and the failure modes of the tools they are using.

Article 4 is a deployer obligation. The deployer is the organisation using the AI system, not the AI provider. For most UK public sector suppliers, Article 4 lands directly on the bid function because the bid function is where AI usage in the organisation has scaled fastest. Generic AI awareness training does not satisfy Article 4 in a bid context because it does not address the specific scenarios that create the most exposure.

"Article 4 is the obligation that lands fastest, applies most broadly, and is most often misunderstood as something the IT department handles."

Article 26: Deployer obligations

Article 26 sets out the obligations that fall on deployers of high-risk AI systems. As legislated, Article 26 applies fully from 2 August 2026, although the proposed Digital Omnibus would postpone application of high-risk obligations for stand-alone systems (Annex III) to 2 December 2027 and for embedded systems (Annex I) to 2 August 2028. Until the postponement is formally adopted, organisations should plan against the legislated 2 August 2026 date.

Article 26 contains eight obligations. The four that matter most for bid teams using high-risk AI systems are:

  • Use the system in line with the provider’s instructions for use. Stepping outside the intended purpose voids the conformity assessment the provider has built around the system.
  • Assign human oversight to natural persons with the necessary competence, training, and authority. This is the obligation that creates the Named Reviewer accountability gap in many bid functions.
  • Ensure input data is relevant and sufficiently representative for the system’s intended purpose, where the deployer controls input data.
  • Monitor operation, keep logs for at least six months, and inform the provider of any serious incidents or risks that emerge.

The remaining four cover data protection impact assessments where required, transparency to natural persons affected, cooperation with competent authorities, and (for public authorities and certain regulated entities) registering use of the system in the EU database. None of these obligations is technically complex. All of them require operational discipline that most bid functions today cannot demonstrate.

Article 50: Transparency obligations

Article 50 covers transparency obligations on providers and deployers of certain AI systems, with key obligations applying from 2 August 2026. For deployers, the most relevant obligation is the requirement to disclose AI-generated text published with the purpose of informing the public on matters of public interest. For bid teams, the practical implication is that any AI-assisted content used in publicly visible contexts (case studies, press materials, supplier-facing communications) needs a disclosure protocol.

General-purpose AI (GPAI) and the Code of Practice

Obligations on providers of general-purpose AI models began applying on 2 August 2025. The General-Purpose AI Code of Practice is a voluntary compliance instrument, drafted by independent experts under the European Commission, that gives providers a structured route to demonstrating compliance with the AI Act’s GPAI obligations. While the Code is provider-facing, deployers using GPAI models (which includes any organisation using ChatGPT, Microsoft Copilot, Gemini and similar tools) inherit downstream documentation expectations. Bid teams should be in a position to demonstrate that their GPAI use is structured around a provider that has committed to or signed the Code, or that they have taken equivalent due diligence.

Penalties under Article 99

Penalty tier | Maximum penalty | Applies to
Prohibited AI practices (Article 5) | €35m or 7% of global turnover | Use of AI for prohibited purposes (e.g. social scoring, certain biometric categorisation)
High-risk system non-compliance | €15m or 3% of global turnover | Breach of obligations on high-risk AI systems, including Article 26 deployer obligations
Information offences | €7.5m or 1.5% of global turnover | Supplying incorrect, incomplete or misleading information to authorities or notified bodies

UK PPN 017: AI use in procurement

Procurement Policy Note 017 (PPN 017) sets the UK government position on AI use in procurement and tendering. Issued by the Cabinet Office in February 2025, it replaced the earlier PPN 02/24 and applies for procurements commencing on or after 24 February 2025. It applies to all central government departments, their executive agencies, and non-departmental public bodies. Other contracting authorities, including local government and the wider public sector, may apply the same approach.

PPN 017 does not prohibit suppliers from using AI in bid preparation. It explicitly recognises the benefits suppliers gain from using AI to bid for more contracts. The Note’s focus is transparency: contracting authorities should be able to understand whether and how a supplier has used AI when responding to a tender, and what controls are in place around that use.

What PPN 017 actually does

The Note enables contracting authorities to add disclosure questions to the Invitation to Tender requiring suppliers to declare AI use in their tender response. Three example disclosure questions are provided in Annex B. The questions are non-scored: contracting authorities are not permitted to mark a supplier down for using AI, and equally are not permitted to mark a supplier up for not using it. The information is for due diligence and risk management.

The strategic implication for suppliers is the disclosure response itself. A supplier who answers the disclosure question with a single sentence signals one level of governance maturity. A supplier who can describe their AI Usage Policy, the validation process applied to AI-assisted content, the data classification rules that govern what may be input to AI tools, and the Named Reviewer accountability for the final submission, signals a fundamentally different level. PPN 017 does not score the answer, but the answer is read by humans who form a view of the supplier.

Beyond the standard disclosure

PPN 017 also recognises that AI is increasingly used in the delivery of services that are not themselves “AI services”. Where this is likely, contracting authorities are encouraged to require suppliers to declare it. Bid teams should expect this expectation to broaden over time, with disclosure questions evolving from “did you use AI to write this bid” to “how is AI used across the proposed service delivery, and what governance applies”.

UK GDPR: the regime that is breached daily

UK GDPR is the regime UK bid teams are most likely to be breaching today, often without knowing it. The breach is not exotic. It is routine. CVs of named personnel proposed for a bid, references from past clients, named subject matter experts, case studies featuring identifiable individuals: all of these are personal data within the meaning of UK GDPR. When that personal data flows into a public AI tool to be summarised, restructured or rewritten, several UK GDPR provisions are immediately engaged.

Article 6: Lawful basis

Processing personal data through an AI tool requires a lawful basis under UK GDPR Article 6. Most bid teams default to “legitimate interests” without conducting a legitimate interests assessment, a failure that itself breaches the accountability requirement to document the decision. Where consent is the lawful basis, that consent must be specific, informed and freely given; a CV provided for a job application does not include consent to be paraphrased by ChatGPT for a tender response.

Article 22: Automated decision-making

UK GDPR Article 22 places safeguards on solely automated decision-making that has legal or similarly significant effects on individuals. AI use in bid preparation rarely meets the “solely automated” threshold, but the safeguards become relevant the moment the bid team uses AI to make staffing recommendations, to score CVs, or to evaluate which named personnel should be put forward. The line between AI-assisted human decisions and AI-driven decisions is operational, not theoretical.

Article 35: DPIAs

UK GDPR Article 35 requires a Data Protection Impact Assessment where processing is likely to result in high risk to individuals. Bid teams using AI to process personal data at scale, particularly where the AI provider is outside the UK and the data flows internationally, will frequently meet this threshold. The DPIA needs to be conducted before processing begins, not retrospectively.

Penalties

UK GDPR enforcement penalties run to £17.5m or 4% of global turnover for the most serious breaches, applied by the Information Commissioner’s Office. The likelihood of a public AI tool’s use in bid preparation triggering an ICO enforcement action is low. The likelihood of it triggering a buyer’s data protection due diligence question, particularly under PPN 017’s broader expectations, is rising rapidly.

The proposed Digital Omnibus

The European Commission proposed Digital Omnibus amendments on AI on 19 November 2025. The Omnibus is a comprehensive package of changes to the EU AI Act and surrounding legislation, intended to clarify the Act’s interaction with other laws, support innovation, and adjust phased application timelines. As of late April 2026, the European Parliament adopted its negotiating position on 26 March 2026 and the Council adopted its position on 13 March 2026. Trilogue negotiations between the Parliament, Council and Commission are now underway, with a political objective of reaching agreement by spring 2026.

The amendments most relevant to UK bid teams concern application timing for high-risk AI systems. Both the Council and Parliament negotiating positions converge on 2 December 2027 for stand-alone Annex III high-risk systems and 2 August 2028 for embedded Annex I systems. If adopted, these dates would replace the currently legislated 2 August 2026 full applicability date for these specific provisions. Article 4 (AI literacy) obligations would also be modified, with the Commission’s proposal shifting some elements toward Member State responsibility for promoting literacy rather than enforcing organisation-level training requirements.

Two operational positions follow from this. First, the postponement is not yet law and may not be adopted in its current form. Organisations that delay their compliance work expecting it to land are exposing themselves to a position where the postponement does not happen and they are out of time. Second, the underlying expectation that organisations train and evidence AI competence is durable. Whether the legal hook is Article 4 in its current form, an amended version, Article 26 deployer obligations, PPN 017 disclosure expectations, or buyer-side governance scrutiny, the question every UK public sector supplier will need to answer is the same: can you demonstrate that AI use in your bid function is governed?

"Plan against the legislated dates. Do not plan against the postponement that may not happen. The cost of being early is small. The cost of being late is the contract."

What UK bid teams should do in 2026

Reading the regulation is not the work. The work is translating regulatory obligations into operational discipline inside a bid function that is already running at capacity. Five priorities should sit at the top of every UK public sector supplier’s bid governance agenda this year:

1. Confirm scope

Establish, in writing, whether your organisation falls within scope of the EU AI Act, what tier of obligations applies, and where PPN 017 applies to your live and forecast pipeline. Do not assume out-of-scope; document the assessment.

2. Build the AI literacy evidence trail

Article 4 has been live for over a year. Most bid teams cannot evidence training, even if their people have informally absorbed AI knowledge. Structured, role-specific, CPD-accredited training with documented completion records is the standard regulators and buyers will look for. ARIA Compass Foundation and Practitioner are calibrated for exactly this.

3. Close the Named Reviewer accountability gap

Identify, by name, the person who holds Article 14 and Article 50 editorial responsibility for every AI-assisted submission. Verify that they have sufficient competence, training and authority to exercise oversight. Document this. The biggest gap most bid functions have today is between nominal sign-off and genuine review.

4. Get UK GDPR right at the prompt level

Establish data classification rules that govern what bid team members may and may not put into AI tools. Ensure that any AI provider processing personal data has a data processing agreement in place, a lawful basis under Article 6 documented and, where the threshold is met, a DPIA conducted before deployment.

5. Prepare a credible PPN 017 disclosure response

Draft your response to the standard PPN 017 disclosure questions now, before they appear in a live ITT. The exercise of drafting will surface the gaps in your governance while the risk is still low. ARIA Pulse and ARIA Diagnostic are the independent route to that drafting exercise; an internal-only attempt rarely surfaces what an external assessment does.

Common regulatory questions

What bid leaders ask first.

Answers calibrated for UK public sector suppliers in 2026. Not legal advice; for that, speak to your in-house counsel or external solicitor. For the operational response, speak to us.

Q. Does the EU AI Act apply to UK organisations?

Yes, where there is EU market exposure. The EU AI Act applies extraterritorially to providers placing AI systems on the EU market and to deployers whose use of AI affects people in the EU. UK public sector suppliers bidding for work that touches EU buyers, EU end-users, or EU-based subcontracting chains can fall within scope. The right approach is to assume scope unless a specific assessment confirms otherwise.

Q. What is EU AI Act Article 4 and does it apply to bid teams?

Article 4 of the EU AI Act requires that staff of providers and deployers using AI tools have sufficient AI literacy. It has applied since 2 February 2025. For bid teams, this means every person using AI tools professionally in bid preparation needs literacy proportionate to their role. Note that the proposed Digital Omnibus amendments may modify the framing of Article 4 obligations; the underlying expectation that organisations train and evidence AI competence remains.

Q. What is PPN 017 and what does it require of bid teams?

PPN 017 (Procurement Policy Note 017, replacing PPN 02/24) sets the UK government position on AI use in procurement. It applies to central government departments, executive agencies and non-departmental public bodies for procurements commencing on or after 24 February 2025. It enables contracting authorities to ask suppliers to disclose AI use in tender preparation. Organisations that cannot answer the disclosure question credibly signal governance immaturity to evaluators.

Q. When do EU AI Act high-risk and Article 26 deployer obligations apply?

The currently legislated date for full applicability of high-risk obligations and most remaining provisions is 2 August 2026. The European Commission has proposed Digital Omnibus amendments that would postpone application for stand-alone high-risk systems (Annex III) to 2 December 2027 and for embedded high-risk systems (Annex I) to 2 August 2028. The Council and Parliament adopted negotiating positions in March 2026; trilogue negotiations are ongoing. Organisations should plan against the legislated 2 August 2026 date unless and until the postponement is formally adopted.

Q. How does UK GDPR apply to AI use in bid preparation?

CVs, references, named personnel and case study material are personal data. Processing them through public AI tools without a lawful basis under Article 6 or without an appropriate data processing agreement with the AI provider is a UK GDPR breach. Where AI-assisted decisions affect individuals significantly, Article 22 safeguards on automated decision-making may apply. Article 35 may trigger a Data Protection Impact Assessment. ICO enforcement penalties run to £17.5m or 4% of global turnover.

Q. What are the EU AI Act penalty levels?

Article 99 of the EU AI Act sets three penalty tiers: up to €35m or 7% of global turnover for prohibited AI practices (Article 5); up to €15m or 3% for non-compliance with high-risk system obligations; and up to €7.5m or 1.5% for supplying incorrect, incomplete or misleading information to authorities. For SMEs, fines are capped at whichever of the fixed sum and the turnover percentage is lower. National competent authorities apply penalties in accordance with national law.

This guide is published by

UK bid management consultancy. APMP-certified leadership. 25+ years of public sector bid experience. Trading as Bid Savvy.

SMARTER BIDS. REAL RESULTS.

About this regulatory guide, by Bid Savvy

Written from inside the bid function, not from outside the law.

Most regulatory writing on the EU AI Act and PPN 017 is written by lawyers, AI consultancies, or compliance generalists. This guide is different. It is written by Bid Savvy, a UK bid management consultancy with over two decades of public sector bid leadership, calibrated for the people actually writing, reviewing and signing off bids under deadline pressure.

This is not legal advice. For legal advice, speak to your in-house counsel or external solicitor. What we publish is the bid leadership translation of regulatory obligations into operational discipline that survives a Tuesday afternoon at deadline week. If that is the perspective you need, ARIA Pulse, ARIA Diagnostic and ARIA Compass are the engagements that turn it into outcomes.

From regulation to operational discipline

Discuss ARIA for your bid function.

A discovery call confirms which combination of ARIA Pulse, ARIA Diagnostic and ARIA Compass is the right fit for your organisation, your timing, and your buyer landscape. No proposal, no obligation.