AI’s Role in Data Security and the AI Regulatory Environment
Thomas Jul, CEO of Inpay, Carey Kolaja, CEO of Versapay, and Ronen Cohen, VP of Strategy at Duality, join Jill Malandrino on Nasdaq TradeTalks to discuss AI’s role in data security and the data privacy and AI regulatory environment.
00:08Welcome to Nasdaq TradeTalks,
00:11where we meet with the top thought leaders and strategists in TradFi, digital assets,
00:14technology, and financial planning.
00:16I'm your host, Jill Malandrino,
00:17and joining me on the desk at the Nasdaq MarketSite is Thomas Jul,
00:20CEO of Inpay, Carey Kolaja,
00:22CEO of Versapay, and Ronen Cohen,
00:24Vice President of Strategy at Duality.
00:26We're here to discuss AI's role in data security and
00:29Data Privacy and the AI regulatory environment.
00:32It is great to have everyone with us covering my favorite subject on TradeTalks.
00:36Let's go around the horn quickly, Thomas. We'll start with you.
00:38Explain to us where Inpay fits within the data privacy and AI ecosystem.
00:42Yeah, thanks, Jill. Thanks for having us.
00:44So InPay is a cross border payments company.
00:47We do cross border payments all over the world.
00:49So we connect different senders and receivers.
00:52We try to do that safely,
00:54fast, and secure, which I think is part of what we are here to talk about today.
00:58And for us, it's about technology,
01:00it's about network, and it's about compliance.
01:02And we built that for 16 years,
01:03and we believe that we are the leaders in that technology.
01:05All right. And Versapay?
01:06Sure. So Versapay is in the business of offering accounts receivable efficiency suites.
01:12Ultimately, what that means is we try to make it easier for
01:14businesses to get paid for the goods and services that they provide.
01:17And we do that through the process of automating invoicing,
01:21moving b2b payments, and as well as reconciling cash invoices, using AI itself.
01:26So you can imagine with every touch point when money moves,
01:29data moves, and therefore,
01:31security compliance and regulation is critical.
01:33All right. And Duality. Thanks for having me.
01:36Duality is a secure data and AI collaboration platform.
01:39We're used in the public and private sector by organizations that want to
01:43jointly analyze very sensitive data and derive outcomes and insights with each other,
01:47and it's used by collaborating partners and sometimes even competitors to do that.
01:51Yeah. Well, it's so interesting, Ronen, because you hear competitors,
01:54consumers, you know, your own tech stack and so forth.
01:56And then AI, it's used for really,
01:58really good things, and it's also employed by bad actors.
02:01So what is its role in data security?
02:04It plays a huge role, and data security plays a huge role in AI as an enabler of AI.
02:09You know, privacy and security are really the main blockers in terms of leveraging data.
02:13I think what you'll see is that AI application providers,
02:17the way that they're going to have to differentiate themselves is how they
02:20can access sensitive data to make their models better,
02:23to work more closely with their customers.
02:25The big challenges there are security and privacy.
02:27Right. And especially, you know,
02:28when you're thinking about cross border payments and so forth.
02:31And from what I understand you operate in virtually any place
02:33around the planet where you're able to.
02:35Yeah. That's correct. And I can totally back what Ronen is saying,
02:39because it's all about the industry.
02:42This is not about Inpay and not about specific companies.
02:44How do you, as an industry, build a safe environment in
02:46which the good guys can transact and the bad guys are kept out?
02:49And I think what you're saying is exactly what we need to do. We need
02:52to think about how do we, competitors, partners,
02:54whatever we call ourselves,
02:56actually act together to ensure that
02:58the payments we deliver are actually done to the right people.
03:01Yeah, I would add to that.
03:02I think, you know, we've really shifted in our society and in the commercial ecosystem,
03:07too, it's not about how do I win?
03:10It's about how do we win,
03:11and the world is moving so fast and technology is moving so fast,
03:14like, you can't keep up, Jill, anymore with all aspects of your business.
03:17And so if I can work with a competitor or a partner
03:20to share data signals that are done anonymously with
03:23keeping the consumer privacy regulation in mind to help
03:27service them better and to help protect against bad behaviors and nefarious activities.
03:32Like, I think that's something we would all want to opt into.
03:34But finding those right platforms is difficult.
03:36And that's essentially what Duality does.
03:39You sit in the middle of the data custodians, the compliance teams,
03:42and then being able to build these applications leveraging AI.
03:45Exactly. Absolutely. And you know, there's a dilemma,
03:48and there's always a trade-off because it's a we versus me, like Carey and Thomas brought up.
03:53But at the same time, you still have to protect yourself,
03:55you have your IP, you have your sensitive data,
03:57you have your proprietary data.
03:58How can you balance that?
04:00And that's where privacy enhancing technologies come in and secure collaboration in AI
04:05comes in to be an enabler
04:07to these outcomes and to these insights and to keep in your world,
04:10bad guys out and the good guys in.
04:12But how do you do that, like, with who owns the IP of the model that's created?
04:16So say we all kind of opt into this and we want to make our businesses better,
04:19and we want to make our businesses safer.
04:21You have to build a model with some logic that makes
04:23sense as to what it's solving for. So who owns that?
04:26It depends on the situation, of course.
04:29But, you know, the way that we address it is there's a suite of privacy technologies,
04:33there's the analytics, but then there's the technically enforced governance.
04:36And so in any given collaboration,
04:38the different parties come together and they say, this is the outcome we want.
04:40This is how we want it, and then technically enforcing it is really the rub here, right?
04:45Because otherwise, how are you really protecting yourself?
04:48Right. Right. Well, I mean, that really is the question then when you think about
04:51data privacy and who's regulating this AI environment, Who is that?
04:55Is it, you know, we know that the EU has some framework.
04:58We're still working on that here in the US,
04:59and, of course, there's this proliferation because,
05:01you know, each individual state has its own set of standards.
05:05So is it almost like an SRO,
05:06like a self regulatory organization thing and just doing the right thing as an industry?
05:10Or where is this framework? Where does it exist?
05:13Well, for us, I mean, being in cross border payments all over the world,
05:16we basically have to go into every jurisdiction that applies to a payment, right?
05:19So, being able to open a corridor takes a long time,
05:21we have to investigate, how does it work?
05:23What other things we need to do?
05:24If you wanted me to do, you know,
05:26a payment for you in let's just say Australia,
05:28I would have to understand what's happened in Australia,
05:30what happens in the US? What happens on the way?
05:32But then once you have that,
05:34then you would basically have to follow what
05:35happens in jurisdictions as things go along, right?
05:38So it's a lot of work that goes before so that when you need the payment,
05:41which hopefully is near instant or instant in the best case.
05:44Then you're already ready, right?
05:46And that's the work where Carey and I would spend a lot of time.
05:49But that's not to say it's easy.
05:52This is a very difficult,
05:53complex problem we're all trying to work through.
05:56But in the EU, there is a regulatory framework where
05:58everybody abides by one thing. In the US, like,
06:02we're managed at the state level,
06:04and different types of data,
06:05whether it's biometric data or whether it's PII data are managed differently.
06:09And so in those types of scenarios,
06:11even with some of the regulation on the hill,
06:13It is like this one, two, three layers of defense.
06:15Like, it's what am I doing to protect
06:17my own customers and the businesses that I'm working with?
06:20What are the people who are working with me doing?
06:23And do I trust them, to your point of how do you do a KYB on them?
06:26And then do I trust the people I'm actually maybe
06:28giving information to in the process of payments.
06:31It's card processors or banks, et cetera.
06:34And you have to self govern at some degree until we do have a framework,
06:38I think, a global framework that makes more sense of this.
06:40And that's part of the challenge, too.
06:42When you self govern or self censor,
06:44you're actually reducing the value that you're getting out of the AI,
06:47you're censoring what inputs you're putting in,
06:50which censors what you're getting out and the value that you're getting from it.
06:53So really, you know,
06:55we see organizations doing this all the time.
06:57And so, obviously, there's regulation that you need to comply with.
06:59But everyone's kind of scared because you don't really understand
07:02where AI's going and where the data's coming from and how the data is collected are used.
07:06And so that's really where you need to protect yourself, as it were.
07:10And, I mean, I think for a Versapay customer
07:11to actually be able to trust you,
07:13you have to trust me if I was the one,
07:15you know, doing the payment for you.
07:16So we would have to spend a lot of time as
07:18an industry together to actually validate each other.
07:21And then when we're ready, then we build the environment in which we can
07:24then go ahead and that takes a lot of work. It does.
07:27AI is in there. Then, of course,
07:29when the transaction happens, there's millions of transactions all the time, right?
07:32Right. They are also screened,
07:34but that's a totally different story. So, Ronen, let me ask you this.
07:36I mean, companies are just rushing to spend billions of dollars on AI,
07:39and it's almost like, is there even a problem that needs to be solved?
07:42But the question is, you want to be first to go to the market?
07:44You want to establish revenue lines with this new technology,
07:47but then you also have to balance out the risk management side of it, too,
07:50when it comes to open access to your data or you know,
07:54never mind, you know, the cybersecurity implications around it, too.
07:57So how do you measure that potential return relative to the risk that exists?
08:01Well, to the point of what we were just discussing,
08:04I think the return is going to be limited if you're limiting yourself. Mm hm, right?
08:07And so I think that you'll see that the way that AI application vendors are going to
08:11differentiate themselves is how they can actually access and use sensitive data,
08:16proprietary data, and collaborate with their ecosystem with their partners.
08:19That's really the way to unlock data and unlock value.
08:22And so I think what you'll see is just like, you know,
08:25RSA unlocked Ecommerce, the world of
08:27Ecommerce by protecting credit card transactions online.
08:30You'll see privacy enhancing technologies unlock
08:33the world of AI and unlock the ROI that comes with it.
08:36Ronen, I do want to touch on something Jill asked about,
08:38are we solving a problem?
08:39Because I do think that all of us who have been in business,
08:42when gen AI came to the mainstream,
08:45and it was part of everyday conversation.
08:47There was this rush to talk about it or
08:49put it on the end of what your value proposition is.
08:52And I think we're in a world now where you
08:55have to be using data to create value for yourself,
08:58for your customers, for your shareholders.
09:00At the same time, it's not going to solve every problem.
09:03Like, I look at the world that we're in right now where
09:05business-to-business transactions, particularly in the field.
09:08It's an underserved market.
09:10They're still using legacy systems.
09:1133% of it is still managed by check, you know?
09:14And even though we're doing billions of transactions, you know, at the end of the day,
09:18you know, I look at is AI really going to solve
09:20the problem that this office of the CFO needs?
09:22Not necessarily, maybe over time, more sophisticatedly,
09:26but I do think that's an important point is,
09:28is it solving the problem or is it in search of a problem?
09:30And maybe we're just not ready in some areas.
09:33But I think that again,
09:34back to your point about the EU and what happened there already,
09:37and with the whole data owner and the data processor?
09:40That's the question. Exactly. But the data processor and the data owner.
09:42I mean, that is exactly what's defined in that law, right?
09:44Right. The point being that once you go digital,
09:46it's a lot easier to handle who actually can keep what at what point in time, right?
09:50Whereas with checks, I think there are very few areas where
09:53the US, technology-wise, is not ahead of Europe, but exactly in that area,
09:57actually the European economy is because a lot of it is actually digitized already,
10:01and that makes sure that we can take the data,
10:02put it into your model, use it the way we should,
10:05but protect it and actually remove it again.
10:06I think that's a very important point.
10:09Well, the EU looks at it as the consumer owns the data.
10:12I mean, it's a consumer-driven act.
10:14That's sort of the paradigm of the EU laws. Yeah.
10:19Because when we look at President Biden's executive order, you had noted that,
10:22it seems as if it's more about the government and what it is allowed to do as it
10:27relates to privacy protection and combating other let's call it national security risks,
10:33but it doesn't appear that it's necessarily geared towards companies or towards people.
10:38It's more of a government action. Almost like the First Amendment.
10:41Yeah, you know, the executive order that you're
10:43referencing really is designed to protect the First Amendment,
10:45protect data privacy, and that starts with government.
10:48But, you know, there are federal AI acts, federal privacy act,
10:51and even state level laws coming through that will govern this,
10:56you know, on a consumer basis or even an industry basis,
10:58like in financial services.
11:00So I think the government in the US is kind of leading the way,
11:03actually and thinking about how you can leverage
11:05AI while still protecting our rights as citizens.
11:08And that's a really important benchmark for everyone I think so,
11:11too, if we work from a position of the citizen owns their data.
11:16And then when you look at a supply chain of how data moves
11:19and how it's created at what points who owns that.
11:23But I do think, in the democracy we're in, data
11:27and AI are becoming part of the fabric of what our future holds.
11:32And we're talking a lot about financial crimes
11:34because we happen to be in the payment space.
11:36The reality is it's beyond financial crimes,
11:38like misappropriation of imagery and content and misuse and that's
11:43where it starts to get even more concerning for me of who has
11:45the right to create and then who has the right to use.
11:48And let's remember, we're all governed by licenses and all kinds
11:52of regulatory environments and
11:53authorities that actually want to check that what we've done is correct.
11:56And if something happens later,
11:57anti money laundering laws and so forth,
12:00we have to be able to prove who actually did transact,
12:03who was the receiver, who was the sender, where did it go?
12:05So there is something there: the way
12:09that the laws are built has to take that into account.
12:12I totally agree, it has to go into every part of the fabric. Otherwise, it won't work.
12:16Which is so interesting, because when they included
12:18First Amendment rights within this act,
12:21my first thought was, almost similar to yours, that this is kind of
12:23advancing the First Amendment into the 21st century,
12:27if you will, only because, you know,
12:28who does control our data?
12:29And then perhaps it's not even us.
12:32And the way we were talking about before,
12:33deepfakes, where Katy Perry was, like,
12:35in multiple places at the Met Gala when she actually wasn't, and people thought she was.
12:40So it almost feels like, you know, there's that argument that some
12:42of the amendments in the Constitution could perhaps be outdated,
12:45but it sounds as if acts like this are incorporating some of that.
12:49Yeah, I think it's starting to,
12:51but I think it's just the beginning.
12:52Right. I think we would all agree to that.
12:54I mean, I was looking at the news a couple of months ago,
12:56and I think there was an incident in Hong Kong, if I recall,
12:59where this is where AI makes me nervous,
13:01where a financial controller was sitting in a Zoom call with his staff.
13:05Do you guys hear about this? And effectively,
13:08they were all deep fakes.
13:09And one was acting as the CFO.
13:11And the CFO instructed them to move $25 million, and he did.
13:16And so you start to look at digital imagery.
13:19The Katie Perry example is kind of fun.
13:22It creates a lot of social buzz.
13:24But something when you're moving money that is
13:26at the detriment of a business, that's pretty problematic.
13:28And so how do you govern that? Right?
13:30Because it's not the bank's fault that the money's been moved. It did its job.
13:33You know, the instruction got issued
13:35from the right person who was approved to actually move the money.
13:37A foreign minister was in a meeting with a deepfake, actually.
13:41So how can you be proactive about data security measures?
13:44Like, what are the best practices adhering to the framework that's in place?
13:47I mean, you're in financial services,
13:48so, of course, that's heavily regulated.
13:50You know, you do your best to adhere to that.
13:52What are some best practices?
13:53Well, I think, for us, we complete
13:57more than 99% of the payments that we're asked to do,
14:00and that's quite unique,
14:01even if they weren't complex, but they are.
14:03I think the first thing you do is you build an environment that's actually safe to be in.
14:07And again, Carey and I would spend a lot of time before we actually start transacting,
14:12checking each other, making sure of the networks that we put together.
14:14It could take months before you partner.
14:15It could take months. It could take months.
14:17But once we're there, we know that we actually have the right,
14:19you know, guard rails around what we're doing.
14:21We know that we're in the right environments.
14:22And then we start looking at transacting.
14:24Then what we do is we look at every transaction, we screen them.
14:27There's all kinds of regulation and stuff that we can do,
14:30we partner with companies like
14:31yourselves that can actually help us do that screening on the go.
14:35And then, of course, afterwards,
14:37we also then look at monitoring what actually happened,
14:39who did this over 24 hours,
14:41who did what, that kind of thing.
14:42So we always track and try to find out what's going on.
14:45But it's always about tracking the data and whoever is trying to break in.
14:48Well, I mean, it feels like, Ronen, it's always going to be a game of playing chess.
14:51You're just always going to have to be thinking a
14:52step ahead and trying to stay ahead of the technology.
14:55Yeah, you always have to do that.
14:57And that's why, you know,
14:58having a posture where you're protecting your data and still
15:02being able to use it, and starting with that as the starting point, right?
15:05How do I get to this data?
15:06But how do I use it in a way that keeps it secure,
15:09keeps it private, and obviously keeps it compliant?
15:12That's really where you need to start, right?
15:14Especially when the data is sensitive,
15:15like in the world of fraud or
15:17healthcare and life sciences or national security, like you mentioned.
15:20Carey, you pointed out something interesting in your notes here, that if
15:23we move forward with trust, inclusivity,
15:26and secure identity ecosystems, that
15:28can really protect and promote economic growth. What do you mean by that?
15:32I think data is at the core of how we can make
15:36different decisions in order to help people get the goods,
15:40the services, the assets they need.
15:41Like, you were talking about moving money into Myanmar.
15:43You know, look at some of the tornadoes that are happening here in the US,
15:46that if we actually understand who somebody is, how they were impacted,
15:50and what they were entitled to, and we can do that
15:52through data and data that's harmonized,
15:54Jill, not data that is effectively sitting in all these different repositories,
15:58We can help create a better society.
16:00I think we can enable businesses to be more successful.
16:02We can enable citizens in order to live the lives that they want to live.
16:06Take, like, unemployment insurance, for example.
16:09Like, when COVID happened,
16:11there were hundreds of billions of dollars that were probably
16:14misappropriated and misused because businesses and individuals
16:18pretended they were somebody they were not, or that they were impacted when they were not.
16:21Think about if that money went into
16:23other aspects of our society and the problems that we have to solve.
16:26And so that's where I think, like, trust, transparency,
16:29using information in the right ways,
16:31protecting it against the bad actors can actually spur economic growth.
16:35That's a great example that you, Thomas,
16:36you moved money to Myanmar via the Red Cross in a day.
16:40How long would that have taken if these tools were not in place?
16:44Weeks. Yeah.
16:45Weeks. You don't have that kind of time when you're dealing with a humanitarian crisis.
16:48That's what you have to deal with until these things are in place, right?
16:51And if you think about the world, if you think about anything outside
16:54the societies that we're used to being part of, where financial infrastructure works
16:58and so on, these possibilities are significant, right?
17:02I mean, you can include parts of the world that
17:03before did not have financial infrastructure,
17:06you can take people who actually have the right to thrive to get on with it,
17:09to do their business and get along with their lives. That's what we have.
17:12Does it ensure it gets into the right hands?
17:15I was just going to say, it used to take three weeks?
17:18Moving it could take seconds.
17:19But does it get to the right people?
17:22That is where I think, again, coming back to what a society and what sort of
17:25an ecosystem is because what we do is we partner up with people that we can screen.
17:29We know how to get them there.
17:31We can get the money into Myanmar.
17:32We cannot control what happens inside Myanmar, of course.
17:35Right. But by screening the partners that we choose,
17:37by screening the corridors we use. That takes months.
17:40It actually takes months, maybe six,
17:42seven, eight months if it's difficult.
17:44But once we have them, we know exactly where we put them.
17:47How they then disperse from there,
17:48of course, is out of our hands.
17:50But that's how the ecosystem has to work for us.
17:53But we can put the money into a country,
17:55you know, in near real time.
17:56Yeah. I mean, it really is quite fascinating.
17:58I mean, if you think about it on a timeline scale,
18:00just how quickly this has evolved.
18:01Right. Absolutely. And you know,
18:04I want to go back to a point that Carey made
18:06in terms of the use of data and helping people, you know,
18:08whether they're in Myanmar or in Nebraska with the tornadoes,
18:11the data that we all need is actually very siloed,
18:14not just in our own organizations,
18:15but across different companies across different countries as well.
18:19And so in thinking about how we could do this,
18:21collaboration has to be a part of it.
18:24There's no way that any one organization can go it alone.
18:27The AI application vendors need the data.
18:30You guys need the data, right?
18:32And so the way that we can continue to innovate and continue this pace of
18:36growth that you just referenced is through this collaborative posture,
18:41but it also has to be secured the entire way through.
18:43And the disruption that's going to happen in this industry
18:45is exactly driven by that possibility.
18:47It's not just the need. It's the possibility that we can now do it better.
18:50We don't have as many areas of,
18:52you know, where faults can happen.
18:53We don't have so many places that things can go wrong,
18:56and at the same time, it's better service, right? So we get on with it.
18:58But I think it's incumbent upon us as leaders,
19:02whether you're in enterprises or companies or NGOs or in, like,
19:06large markets and diplomacy that we
19:09effectively have to educate people on how their information is being used.
19:13At the end of the day, like, 'cause I'm willing to give up
19:16information if I know my life is going to be better,
19:18or I can make different choices.
19:19And that is something that's incumbent upon all of us.
19:22That's also where the trade-offs, sorry, the trade-offs
19:24can kind of go away with new technologies as well,
19:27where I know that my information is going to be used for some purpose,
19:30but I know that it's going to be done securely.
19:32That's exactly it.
19:34Consent isn't enough anymore, right?
19:36You need the security because there's bad guys everywhere trying to get at this data.
19:40And we need to work together to make that much
19:42more safe because we get fast, safe and simple.
19:45If we can make it like that, it's going to work for the good guys,
19:47and hopefully we can keep the bad guys out.
19:48Alright. We'll leave it on a positive note. Appreciate everyone's insight.
19:51Thanks for joining us on TradeTalks. Thanks.
19:52Thanks for joining me from MarketSite.
19:53I'm Jill Malandrino, Global Markets Reporter at Nasdaq.