
By Pål Aaserudseter, Security Engineer for Check Point

As many now know, Italy has (temporarily) banned ChatGPT over privacy concerns, and other nations like Germany, France and Ireland are looking into it as well.


In short, the ban is related to GDPR: it stems from a ChatGPT data leak that happened on March 20th, and from the fact that ChatGPT does not age-restrict usage.

To be specific, the data leak was caused by a technical error (bug) that exposed chat (topic) history to other users, and also exposed account, billing and partial credit card information of users.


In addition, Italy (and other nations) is concerned that uploaded user data may be stored in ChatGPT and that others could access it.

ChatGPT explicitly states that any uploaded user data is kept only in memory and deleted after use, UNLESS SPECIFIED TO KEEP THE DATA!

Users and applications uploading data to ChatGPT are responsible for securing that data, not OpenAI/ChatGPT.
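To illustrate that responsibility, here is a minimal, hypothetical sketch (not OpenAI's or Check Point's tooling) of the kind of redaction pass an application could run over user input before anything is sent to an external chat service. The patterns are simplified examples and nowhere near an exhaustive PII filter:

import re

# Hypothetical example: redact obvious PII before sending text to a chat API.
# These patterns are simplified illustrations, not an exhaustive filter.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),   # rough credit-card shape
    "PHONE": re.compile(r"\b\+?\d[\d \-]{7,}\d\b"),
}

def redact(text: str) -> str:
    """Replace anything matching the patterns above with a placeholder tag."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

if __name__ == "__main__":
    prompt = "Summarise this: john.doe@example.com paid with 4111 1111 1111 1111."
    print(redact(prompt))
    # Summarise this: [EMAIL REDACTED] paid with [CARD REDACTED].

Real deployments would combine far richer detection (DLP engines, named-entity recognition) with policy controls, but the principle is the same: the data owner filters before uploading.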


Another issue for Italy is that OpenAI/ChatGPT does not implement any controls for age-restricted usage. It is recommended that children under the age of 13 use online services under supervision, but as with everything else regarding ChatGPT, it is up to the user or the application leveraging ChatGPT to implement these kinds of restrictions.
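As a purely illustrative sketch (the threshold and the check are assumptions for the example, not a regulatory requirement), an application embedding ChatGPT could gate its chat feature behind a basic age check:

from datetime import date

MINIMUM_AGE = 13  # threshold discussed above; an assumption for this sketch

def is_old_enough(date_of_birth: date) -> bool:
    """Return True if the user meets the minimum age requirement today."""
    today = date.today()
    had_birthday = (today.month, today.day) >= (date_of_birth.month, date_of_birth.day)
    age = today.year - date_of_birth.year - (0 if had_birthday else 1)
    return age >= MINIMUM_AGE

# An application could gate its chat feature on the result:
if not is_old_enough(date(2014, 6, 1)):
    print("Chat feature disabled: parental supervision required.")

A real implementation would of course need verified age data rather than self-declared birthdays, which is exactly the gap regulators are pointing at.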

So, what do I think?

In my opinion, blocking the likes of ChatGPT will have limited effect.

Sure, companies, schools and the like still block social media platforms such as YouTube and Facebook on their networks due to bandwidth limitations and the potential for distraction (among other things), but this does not limit users at home, on their private devices.


Even though Italy has blocked ChatGPT, anyone who is a bit tech-savvy can easily circumvent the block, for instance by using a VPN solution.

The issue at hand is that most users are not aware of the risk involved in using the likes of ChatGPT. Sure, it’s a great tool, but uploading information about yourself or others may have a huge and negative impact if that information is suddenly available to anyone. (Think medical data and the like.)

The potential for data misuse is by far one of my greatest concerns regarding ChatGPT.

Not just for creating havoc, but what if personal data is used for targeted marketing purposes (wait, isn’t this already happening?) to an extent we have never seen before? We’re talking social manipulation on a scale where the buyer persona (i.e., YOU) finds it natural to spend money on whatever businesses, or for that matter criminals, put in front of you.

But it’s not all bad. The use of AI gives us new possibilities in a number of fields. For instance, AI can significantly reduce human error and reduce risk by letting a robot take on tasks instead of a human (like defusing a bomb, going to space, or exploring the oceans).

Also, machines won’t get tired and can work endlessly without breaks, doing repetitive tasks without complaint.

We’ll most likely see massive improvements in the medical industry, with AI applications ranging from diagnosis and treatment to drug discovery and clinical trials.

And in cybersecurity, we are already using AI to discover and block new threats, as well as to process volumes of security logs far too vast for any human to review.
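As a simplified illustration of that last point (the features, sample log lines and model choice are invented for the example, not a description of any specific product), unsupervised anomaly detection can flag the handful of log lines worth a human analyst’s attention:

# The features, sample lines and model choice are assumptions for this sketch.
from sklearn.ensemble import IsolationForest

logs = [
    "GET /login 200 120ms",
    "GET /login 200 110ms",
    "GET /login 200 130ms",
    "POST /admin 403 2300ms",   # unusual status code and latency
    "GET /login 200 125ms",
]

def features(line: str) -> list[float]:
    method, path, status, latency = line.split()
    return [float(status), float(latency.rstrip("ms"))]

X = [features(line) for line in logs]
labels = IsolationForest(contamination=0.2, random_state=0).fit_predict(X)  # -1 marks an outlier

for line, label in zip(logs, labels):
    if label == -1:
        print("Flagged for review:", line)

Production systems use far richer features and purpose-built models, but the idea is the same: let the machine do the first pass over data volumes no analyst could read.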

Hopefully, the positive effect of Italy banning ChatGPT is that the EU AI Act (which was supposed to be finalized in March this year) will come into effect much faster.

As it is now, and as my interview in CyberTalk points out, there are no rules and regulations in place. It is up to the user’s or developer’s ethical compass to use ChatGPT in a responsible manner.

We need regulations in place to ensure that OpenAI, Microsoft, Google and others developing AI technologies protect and control access, usage, privacy and data in a secure fashion.

Right now, it’s a loaded gun that anyone can use for whatever purpose suits them.
