
AI is evolving faster than regulation – Who is really in control of the technology?


Artificial intelligence has moved from a research field to a geopolitical force in just a few years. As AI becomes part of everything from business strategy to national security, a new question emerges: who is really in control of the technology?

The Pentagon calls one AI model “best in class”. Days later, the same technology is described as a potential security risk. When companies such as Anthropic and OpenAI find themselves at the centre of political tensions and government contracts, artificial intelligence becomes more than innovation. It becomes geopolitics.

AI outpaces the law

The EU AI Act, the European Union’s first comprehensive regulatory framework for artificial intelligence, entered into force in August 2024 and will be gradually implemented through 2026–2027. At the same time, the adoption of AI is accelerating rapidly across both the public and private sectors.

“AI is being embraced faster than we are able to regulate it,” says Dr. Piet Delport, Associate Professor and Programme Lead of Digital Assurance & Security Management at Noroff University College.

The question is therefore no longer whether we should use AI – but how we should govern it.

When technology changes the rules

Artificial intelligence is already influencing everything from healthcare and finance to defence and intelligence operations. According to the McKinsey Global Survey on AI, around 70 percent of organisations worldwide have adopted AI in at least one business function. This rapid growth illustrates how quickly the technology is becoming embedded in core processes across industries.

At the same time, cyber security communities report a growing number of vulnerabilities related to AI systems, including weaknesses in training data, model manipulation, and the misuse of AI tools in cyberattacks.

AI does not only represent efficiency gains. It also introduces new attack surfaces, new dependencies, and new regulatory challenges. In this landscape, the need for clear governance, risk awareness, and responsible oversight becomes increasingly important.

What is AI governance – and why does it matter?

When organisations implement artificial intelligence, the challenge is not only technological. It is also about how organisations manage risk, responsibility, and compliance.

“The primary purpose of governance is to create a stable foundation for safe and manageable growth,” Delport explains.

The challenge is that technology often evolves faster than organisational control mechanisms.

“When AI is introduced outside established governance structures, we risk undermining the very stability we aim to create.”

This is why many organisations are now working more systematically with Governance, Risk & Compliance (GRC) frameworks related to artificial intelligence.

The students who will govern the future

Although AI technology evolves rapidly, the Digital Assurance & Security Management bachelor programme is built on principles that remain relevant even as technology changes.

Students learn how organisations can establish effective governance structures, conduct risk assessments, and navigate increasingly complex regulatory environments.

Through realistic case studies, they analyse scenarios where new technologies introduce new types of risk.

“We had to ask questions that leadership might not have considered. It felt very realistic, and a bit intimidating,” one student explains after completing such a case project.

According to Delport, that is exactly the point.

“We train students to become the generation that will design regulatory frameworks, conduct risk assessments, and ensure responsible use of AI. They need to be able to pause and ask: how does this change our risk landscape?”

AI still needs human judgement

Even though artificial intelligence can automate analysis and decision processes, it cannot take ethical responsibility. It cannot independently interpret regulatory grey areas. It cannot assess long-term societal consequences. And it cannot take accountability if something goes wrong.

This is where security leadership becomes essential.

Demand for expertise in information security, risk management, and compliance is growing as AI becomes embedded across industries. Roles such as Risk Manager, Compliance Officer, Security Manager, and Chief Information Security Officer (CISO) are becoming increasingly critical for organisations seeking to combine innovation with responsible oversight.

The future is governed – not just coded

As artificial intelligence becomes integrated into everything from business strategy to military intelligence, the key question is no longer only how the technology works.

It is also about who defines the rules.

Understanding AI therefore requires more than technical knowledge. It requires insight into risk, regulation, accountability, and societal impact.

In the years ahead, artificial intelligence will shape everything from corporate governance to national security. The real question is not only who develops the technology – but who governs how it is used.

Learn more about the Digital Assurance & Security Management programme

This article was originally published by Noroff.


Noroff University College


Noroff University College offers a unique, specialised experience for students seeking a career in tech. It has helped students jumpstart careers in a range of tech-related fields, so whether you’re into cybersecurity, forensics, software development, or animation, Noroff has something for you, along with flexible study options to suit every busy student.

Learn more and apply for Noroff's programs through the Global Admissions platform.

Noroff University College programmes on Global Admissions (all taught in English, each rated 8.2 from 8 reviews, application deadline Aug 2026):

- Bachelor in Cyber Security – Kristiansand, Norway – 3 years – starts Aug 2026 – yearly tuition 11865.23 NOK
- Assurance in Cybersecurity Management (Online PLUS Oslo/Bergen) (Part-time) – Online – 2 years – starts Aug 2026 – yearly tuition 5262.81 NOK
- Digital Marketing (Full-time) (Online) – Online – 1 year – starts Sep 2026 – yearly tuition 8645.28 NOK
- Back-end Development (4 year) (Part-time) (Online) – Online – 4 years – starts Sep 2026 – yearly tuition 4322.64 NOK
- Back-end Development (1 Year) (Full-time) (Online) – Online – 1 year – starts Sep 2026 – yearly tuition 8645.28 NOK
- Front-end Development (Full-time) – Online – 2 years – starts Sep 2026 – yearly tuition 8645.28 NOK
- Back-end Development (2 Year) (Part-time) (Online) – Online – 2 years – starts Sep 2026 – yearly tuition 4322.64 NOK
- 3D Art and Games Technology – content creation (Part-time) (Online) – Online – 4 years – starts Sep 2026 – yearly tuition 4322.64 NOK
