
Veeva on technology migration and genAI adoption in life sciences

Veeva Europe president Chris Moore talks about the company’s new genAI-powered releases and its move from Salesforce to a proprietary CRM platform.

Phalguni Deswal December 06 2024

Last month, Veeva announced multiple generative artificial intelligence (genAI) capabilities in its Vault CRM, such as Vault CRM Bot and Vault CRM Voice Control. This speaks to the life sciences industry's growing interest in developing technologies that incorporate genAI.

A recent GlobalData report noted that AI is a fast-growing industry, with every segment of the AI market expected to grow over the next decade, rising from a value of about $103bn in 2023 to more than $1trn.

Veeva Systems is an industry leader in cloud computing for the life sciences sector, utilised by leading pharmaceutical companies such as GSK, Pfizer, Novartis, and Boehringer Ingelheim. The company unveiled CRM Bot and Voice Control, along with a medical, legal and regulatory (MLR) review bot, at the 2024 Veeva Commercial Summit. It has also designed a proprietary customer relationship management (CRM) platform, which is expected to serve as a specialised system for the life sciences sector.

Medical Device Network sat down with Veeva Europe president Chris Moore at the recent Veeva Commercial Summit in Madrid, Spain, which took place from 19 to 21 November. Moore talks about the company’s efforts in AI and the lessons learned from migrating large amounts of data to a new platform. He also shares his thoughts on AI regulation.

This interview has been edited for length and clarity.

Phalguni Deswal (PD): You made quite a few AI announcements at the Summit. Could you expand on Veeva’s plan for incorporating AI in its products?

Chris Moore (CM): When talking about AI, you first have to start with data. Today’s data landscape is very patchy, meaning you can't go to one organisation and get integrated data. And so, for analytics and AI to work, you have to stitch that data together now in the same way that we fix that for software.

In the clinical space, for example, people are just used to the fact that the trial master file talks to a clinical trial management system (CTMS), which talks to a study startup system, and increasingly, we are stitching it together even more closely between sales, marketing, and medical. The catch is that that's not typically the way the data has worked, so we're doing a few things around it.

First, we've taken the decision that we will do data differently, and that means, for example, for healthcare professional (HCP) reference data, we won't have a different model for every market. We will have a model that supports every market but is consistent. We're then making sure that, for example, our deep data, that is our Link products, directly connects with that. So, when we talk about an HCP in our reference data, it directly connects with our deep data.

Now, what we've realised is that no one company is truly going to provide every single bit of data for everyone, and the gap in the industry is that there is no common definition of, for example, how a healthcare professional should be described. So, what we've made a commitment to do is to build a common data architecture, which looks to create a description of the data required for each of these areas. We're making it available to the industry – not just our partners and customers but the competitors as well.

The aim behind sharing this is that, with a common description for the industry, the cost and complexity for the industry come down, and the overall efficiency and effectiveness of the industry go up. And then, when we start talking about things like AI, that becomes much more achievable.
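To make the idea of a common definition concrete, here is a minimal sketch of what a single, market-agnostic description of an HCP record could look like. The field names and types are illustrative assumptions for this article, not Veeva's published common data architecture.

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative only: the fields below are assumptions, not Veeva's published
# common data architecture. The point is one consistent, market-agnostic
# shape that every system agrees on when describing an HCP.
@dataclass
class HealthcareProfessional:
    hcp_id: str                      # globally unique identifier
    full_name: str
    specialty: str                   # e.g. "Cardiology"
    country_code: str                # ISO 3166-1 alpha-2, e.g. "DE"
    license_number: Optional[str] = None
    affiliations: list[str] = field(default_factory=list)  # organisation identifiers

# The same record shape is used whether the HCP practises in Spain, Germany
# or the US; only the values differ, never the model.
example = HealthcareProfessional(
    hcp_id="HCP-000123",
    full_name="Dr. Example Name",
    specialty="Oncology",
    country_code="ES",
    affiliations=["ORG-42"],
)
```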

For Veeva’s AI strategy, we understand that this is an emerging space with lots of innovation. So, we think it's wrong to say there is one magic answer to this because there clearly isn't. And instead, what we tried to do is create a three-tier approach to this.

First, we're creating our own bots where we have unique capabilities, skills, or technology and where it's more efficient for the industry. Our earliest examples were in the clinical space, with the trial master file: when masses of documentation come in, it is a painful process to categorise it all and file it in the trial master file. We have a bot that does that now, and that's actually just part of the core product.
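As a toy illustration of the categorisation step Moore describes (not Veeva's actual bot), the sketch below routes incoming documents to trial master file sections with a simple keyword heuristic; a production system would use a trained classifier, but the routing idea is the same.

```python
# Toy illustration of automated TMF categorisation, not Veeva's actual bot.
# A keyword heuristic stands in for a trained classifier to show the routing
# step: document text in, trial master file section out.

TMF_SECTIONS = {
    "Trial Management": ["trial master file plan", "oversight committee"],
    "Regulatory": ["ethics committee", "regulatory approval", "submission"],
    "Site Management": ["site initiation", "monitoring visit", "site contact"],
    "Safety Reporting": ["adverse event", "susar", "safety report"],
}

def categorise(document_text: str) -> str:
    """Return the best-matching TMF section for a document."""
    text = document_text.lower()
    scores = {
        section: sum(keyword in text for keyword in keywords)
        for section, keywords in TMF_SECTIONS.items()
    }
    best_section, best_score = max(scores.items(), key=lambda item: item[1])
    return best_section if best_score > 0 else "Unclassified"

print(categorise("Minutes of the ethics committee submission meeting"))  # Regulatory
```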

We're looking at additional bots elsewhere. Here at this summit, we've announced three key areas. One is an MLR review bot, which takes the laborious grind out of the pre-checks and supports that process, but the ultimate approval remains with a human.

The second one is the CRM Bot. Now, this is something where we've really been waiting for the right time and the right way to do it. Here we are creating the framework whereby you can bring your own natural language model, apply it to our data, and surface the results through our solution. We are currently working with Microsoft on this.

Our aim with this product is that if you have a Copilot, you can leverage that investment, but we are keeping it open to other models that might exist out there.
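A minimal sketch of the bring-your-own-model pattern described here, assuming a generic adapter interface; the class and method names are hypothetical and are not Veeva's or Microsoft's APIs.

```python
from typing import Protocol

# Hypothetical adapter pattern: any language model that satisfies this
# protocol can be plugged in, applied to CRM data, and have its results
# surfaced through the application. Names are illustrative, not Veeva's API.
class LanguageModel(Protocol):
    def complete(self, prompt: str) -> str: ...

class CrmAssistant:
    def __init__(self, model: LanguageModel):
        self.model = model  # customer-supplied model, e.g. one backed by Copilot

    def answer(self, question: str, crm_records: list[dict]) -> str:
        # Ground the model in CRM data rather than letting it answer freely.
        context = "\n".join(str(record) for record in crm_records)
        prompt = f"Using only these CRM records:\n{context}\n\nQuestion: {question}"
        return self.model.complete(prompt)

class EchoModel:
    """Stand-in model so the sketch runs without any external service."""
    def complete(self, prompt: str) -> str:
        return f"[model response to {len(prompt)} characters of prompt]"

assistant = CrmAssistant(EchoModel())
print(assistant.answer("Which HCPs were visited last week?", [{"hcp_id": "HCP-000123"}]))
```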

The final area that we have is around voice processing, where we are partnering with Apple, who are doing some particularly interesting work around speech recognition. If you think about most AI right now, the request gets shipped over to a server, the server does the processing, and you get the answer back. What Apple is doing is using the device itself to do that processing and building in next-generation speech recognition, which we plan to build into our applications too.

Part of this process is a high-speed data application programming interface (API). The idea here is to provide near real-time data to feed AI engines. We have great data in our platform and our data cloud, but you have to provide it at a suitable speed to cope with the ingestion needs of the AI technology. So, we're providing a direct data API to support that.
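As a rough sketch of how a consumer might use such a near real-time data feed, the snippet below polls a placeholder endpoint for incremental changes and hands each batch to an AI pipeline; the URL, parameters and payload shape are assumptions for illustration, not the documented interface of Veeva's actual API.

```python
import time

import requests

# Hypothetical endpoint and payload shape, shown only to illustrate near
# real-time ingestion: poll for changes since a watermark and feed each
# batch to whatever AI pipeline needs the data.
BASE_URL = "https://example.invalid/api/direct-data"  # placeholder, not a real Veeva URL

def feed_ai_engine(records: list[dict]) -> None:
    print(f"Ingested {len(records)} records into the AI pipeline")

def poll_changes(since: str, interval_seconds: int = 30) -> None:
    watermark = since
    while True:
        response = requests.get(
            BASE_URL,
            params={"modified_since": watermark},
            timeout=10,
        )
        response.raise_for_status()
        payload = response.json()  # assumed shape: {"records": [...], "watermark": "..."}
        if payload.get("records"):
            feed_ai_engine(payload["records"])
        watermark = payload.get("watermark", watermark)
        time.sleep(interval_seconds)

# Example invocation (commented out because the endpoint is a placeholder):
# poll_changes("2024-11-21T00:00:00Z")
```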

PD: There have been calls regarding AI regulation and accountability. What are your thoughts on the European Union’s AI regulation?

CM: AI regulation and accountability is a nuanced topic with so many different layers. If you look at the EU AI Act, it's one of the first of its kind and probably the first major attempt to put some guardrails in place to aid innovation.

But there's a lot of interpretation involved there. If you distil it down, one of the things it talks about is the impact of the outcome. For example, if you are using AI to decide on the safety systems of a nuclear power plant, that's very important and has to be very accurate. But if the AI is acting as an adviser to a human, the restrictions can be a bit softer.

So, I think it's about being able to categorise AI by its level of impact according to the regulations, but also about considering whether you have the correct consents for the input data, especially if you combine that data and draw insights from it, and whether you are within the regulations, not just across the European Union as a whole but also within individual markets.

So, GDPR is something you really have to keep an eye on. And then the question is, as that evolves, are you evolving your consent to meet it? We are working very closely with our legal team and regulators to understand that and try to design accordingly.

To consider AI accountability, you take a step back and look at the natural language models: you can ask the same question twice and get different answers, which in a highly regulated space is not great. Now, consider the medical-legal review space, for example; it is a very structured thing, and therefore we can automate and review it.

There are other things that we can infer. Humans are not always great at reviewing large amounts of data, so automating that process elevates the role of the human and helps in better decision-making. You do the pre-checks with the bot, but then a human still has to go through, review it and give final consent. By doing that, we see the opportunity for AI to take the grind out of the process, but you're not prematurely putting too much emphasis or reliance on it until it's more proven.

PD: You are in the process of migrating your customers from Salesforce CRM to your proprietary system, Vault CRM. Can you share your learnings from the process?

CM: Two years ago, we announced that we were moving our underlying platform. From the early days of the business, the CRM was built on the Salesforce platform, but we built around it with our own Vault platform. While Salesforce is a great tool for cross-industry usage, what we realised for life sciences is that it doesn't meet all the needs of these customers, particularly combining documents, data, and processes all in one.

To address this, we built the whole platform, but we kept the CRM because it worked and we had a large market share. We'd reached a point where, both technologically and contractually, we had gone as far as we could go on the Salesforce platform. We now had more users on our own Vault platform, which was designed for life sciences.

As of today, we have 32 companies live and using Vault CRM. Although we did not plan to do any migrations of existing Veeva CRM customers to Vault CRM until next year, we will have seven by the end of this year.

A year ago, we talked about major organisations, including GSK, Bayer, and Novo Nordisk in the US, moving to Vault CRM, with migrations starting at the beginning of 2025. GSK plans to move within 2025, with 19,000 users worldwide across all of their global markets live by the end of September 2025.

To accommodate the migration, we built a migrator tool, a software product that we have been running to try to make the process as effortless as possible. We can execute those migrations, and the process is getting smoother and smoother.

The area that we knew was going to be difficult was existing interfaces and reports, where people had customised the tool and we had to recreate them. Initially, we thought we would tidy up the current environment and then move to the new environment. But as we have gone through, we have realised that it is much better for most organisations to move what they have today as quickly as possible, and then to innovate on the new platform.
