At Teradata Possible 2023 in London, the company announced its new generative AI tool. The announcement raised a number of questions. Teradata responded to some of them in an email, while others arose later. Additionally, talking to delegates at its user event, it was clear that others also had questions about the announcement.

To get answers to some of those questions, Enterprise Times editor Ian Murphy had a brief conversation with Hillary Ashton, Chief Product Officer at Teradata.

Why start with Microsoft?

Having listened to attendees and talked to a few, it was clear people were asking: why start with Microsoft and not AWS? And why was there no mention of GCP?

Hillary Ashton, Chief Product Officer, Teradata (Image Credit: LinkedIn)

Ashton replied, “We are actively working with AWS now to bring it to market with AWS. It is public record that Microsoft was first out of the gate with this technology, and that’s what we’re using on the backend here.” Ashton also clarified that the technology in question was OpenAI.

In terms of AWS, Ashton said, “AWS has a couple of things that they thought of as well, and we’re actively working with them to do the exact same thing. It’s really a matter of when services are made available to us.”

When it comes to Google, Ashton made it clear that Teradata works with Azure ML, AWS SageMaker, and Google Vertex. She also reiterated that the tool will be available on all three vendor platforms in time, despite the lack of mention of Google in the initial announcement. That timeframe is the first half of 2024, when the product is expected to go live.

Another reason Teradata led with Microsoft, according to Ashton, is because “we lead with where the market demand is. AWS and Azure are, by far, the market share leaders. And if and when things change, our focus will be customer-driven.”

Where will data be accessed from?

The announcement made it clear that this is about data sitting only in Azure. But Teradata has a lot of customers with on-prem and hybrid environments. We asked what happens to that data. Will customers not be able to access on-prem data?

Ashton responded, “The example that I showed on stage, an architecture diagram, was on AWS SageMaker specifically for LLM. That’s a shopping cart example. It would be the same with Google Vertex. SageMaker, Azure ML, and Google Vertex are all the ways that we would do LLM model training at scale across those different CSPs.”

Integration with the software development lifecycle

Neither the announcements nor anything on the Teradata website talks about how the tool will be integrated into customers’ software development lifecycle (SDL) tools. Yet one of the use cases talks about generating code and models.

Other vendors doing a similar job with code and model generation have said how they will work with the SDL. Why has there been no attempt to integrate this into the software development lifecycle for customers and businesses?

Ashton pointed out that the goal of the tool is to address several use cases. She commented, “Getting it out into the market to enable more people to understand how to use VantageCloud is relatively straightforward. The core enterprise features and capabilities that we have, when you start to think of the pipelining of AI into enterprise features, into model development and model management. That’s absolutely part of the plan. That is the direction that we’re headed in.”

Does code generation mean greater productivity?

One reason for asking this question is that people are already using other products to generate code. There are questions over the optimisation and stability of that code. Customers will want to test and verify anything that is generated.

Ashton replied, “The journey for customers is going to be first understanding your data, what you have, building against that, and in a repeatable way. The primary way is the way that we build software, but it’ll just be faster. It means you need to have a development environment. You do need to verify.”

The level of productivity gain is questionable right now. A lot of vendors are talking it up, but they have sales to make. Ashton said, “The range is 20, 40, 60%. They’re all over the map. It’s not going to just give it to everybody, and everybody’s going to be X percent better. There’s a lot of data that suggests that more experienced people can get to efficiency faster. They will know what they’re asking.”

She agrees that people will become more efficient, but the question is by how much. When it comes to people, she believes that the higher value is in human cognitive load. That opens up questions as to what this means for people. She commented, “It’ll help you be more efficient and learn more quickly. It’ll help you experiment faster, fail faster, and be more successful more often. But it doesn’t take the human out of it.”

Will this de-skill the data scientist or the data analyst?

In the past, we’ve seen increased tooling for network engineers and administrators take over more of the load. We haven’t seen people trained in new skills. You talked about reducing the cognitive load for people. How do we prevent the use of AI to write queries and code within our data environments from deskilling those already working in them? Because people will take the shortest, laziest route possible.

Ashton replied, “It’s the classic people, process and technology. Technology does have a responsibility to be able to force you to prove something. You have to decide that you’re going to use a workflow process, so that’s the process part of technology. But technology can certainly be an enabler to maintain that level of responsibility and supervision. But it’s only part of the equation.

“People can do good things, they can do bad things, and they can do less intelligent things. It’s quite interesting that with the attention that gen AI is getting, it’s also a knock-on effect that there’s a lot more attention to advanced analytics in general.”

AI is changing the boardroom conversation

One of the things that is changing is the boardroom view of analytics. According to Ashton, “AI is one of those massive opportunities for advanced analytics. Traditionally, analytics has been a bit of a back-office conversation. It’s been something that only the experts know anything about. This has made not just gen AI but advanced analytics a board-level and C-level conversation. With that comes the entirety of that leadership organisation.”

Importantly, Ashton sees the board as also recognising some of the challenges that AI brings, such as the risk of a mistake that could cost the business dearly. However, given that the board is engaged, she also sees it thinking through those risks.

Ashton said, “There’s much more attention to, what are the rules of play? How are we going to use it? Who can use it? Where can it go? What data can it look at? All of those things. A year ago, it was a two or three, and it’s a 10 or 11.”

The impact of AI on employees

Much of the conversation on the impact of AI in the workplace has been negative. The focus has often been on savings for the company and the loss of jobs for employees. The question is, how do companies avoid that? Letting people go because you have new technology has unexpected impacts. You lose all that organisational and domain knowledge, something that is rarely recognised until it is too late.

Ashton said, “We’ve been worried about people being out of a job due to technology. What you find is that as one door closes, another door opens, and suddenly you need new skills. It’s fascinating. But you need to be intelligent in how you’re training things.

“The fundamental thing is that just because you can do something with technology doesn’t mean that you should. That’s where these knowledge workers really help organisations be successful. I don’t believe that is going to go away, simply because the economic need to be successful is so high. If you fail, the cost of that, financially and reputationally, is so high that you will have people that you need from time to time.”

Enterprise Times: What does this mean?

The introduction of the new tool will have a real impact on Teradata and its customers. The challenge now is how to move from supporting one cloud service provider, Azure, to making sure all of the big three are supported. Customers will also want to know more about how they will use the tool and incorporate it into their day-to-day environment.

Sitting through two of the training sessions on Teradata VantageCloud and ClearScape Analytics was interesting. Neither session had any focus on the new tool, presumably because it had only just been announced. Similarly, there was no session devoted solely to it, covering its benefits, how to deploy it, or what it can deliver. That was disappointing, as it was the only major announcement at the event.

What is clear is that Teradata is committed to the use of the tool across a range of different use cases. What customers will be looking for now is more clarification of those use cases and how they will benefit from them. Perhaps, when the company puts up a product page, some of those questions will be answered.
