One of the big challenges facing CISOs at the moment is what to do about generative AI. Do they allow it or not? Do they allow access to public gen AI tools, or build private ones? It has become a complicated discussion across enterprises, and many CISOs are seeking advice.

To get a view of how one company, Appian, is approaching this, I spoke with Andrew Cunje, CISO at Appian. Cunje has a history of securing complex enterprise environments in both the public and private sectors. In this podcast, he talks about his perspectives on securing large language models (LLMs) and generative AI.

Cunje talked about the data security and privacy challenges that gen AI and LLMs bring. It is a key consideration in Appian's direction: the company relies on strict access controls and encryption to securely manage customers' sensitive data within dedicated instances.

According to Cunje, “We want to make sure from a policy perspective, but also from an implementation perspective, that their data is only going to transit over dedicated circuits.”

Compliance is a challenge that companies have to deal with every day. Cunje believes that compliance is another area where AI can play an important role. However, there is still a need for careful oversight. He commented, “AI can help with compliance controls but not replace human expertise. I think it’s an AI plus human scenario.”

The conversation also turned to the challenge of transparency. Because customers build on the data they hold, they retain control over, and an understanding of, how their data is used.

Where can I get it?

You can listen to the podcast by clicking on the player below. Alternatively, click on any of the podcast services below and go to the Enterprise Times podcast page.

Enterprise Times on Spotify

Enterprise Times on Soundcloud

Enterprise Times on Podbean

Enterprise Times on Apple Podcasts

Enterprise Times on Amazon


