The heads of Hewlett Packard Enterprise (HPE) and Nvidia offered their perspectives on the emergence of AI during the IT company’s annual Discover event last month, views which went beyond hype by addressing some of the practicalities of implementing the technology.
Nvidia co-founder and CEO Jensen Huang joined HPE president and CEO Antonio Neri (pictured) on the keynote stage to discuss an AI collaboration between the companies, a talk which then went further by exploring how to get the best from the technology while addressing ethical and security concerns.
Huang branded AI as the “greatest fundamental computing platform transformation in 60 years”, continuing an evolution which began with the advent of general-purpose computing and has since seen a shift from processing on CPUs to GPUs, along with a move from an “instruction driven” model to an “intention driven” one.
“Every single layer of the computing stack has been transformed”, Huang explained, noting this has led to a shift in the type of applications which can be written and developed, along with the process for doing so.
The changes across the computing stack are what make AI “such a big deal”, Huang argued, explaining the technology is helping “a whole new industry” to emerge, one which builds on today’s $3 trillion IT industry by producing software with embedded intelligence.
Huang said so-called AI factories are “producing intelligence in high volume”, pitching the IT sector as being at the beginning of this new era.
Neri’s assessment was broadly the same. He pointed to humankind’s eternal thirst for knowledge as a driving force while emphasising the long experience of Hewlett Packard in developing breakthrough technologies and supercomputers.
“Everything we have accomplished has led us to this moment today. We are again leading a revolution”, he stated.
“When the human mind is in concert with AI, there is nothing we cannot achieve together”, Neri noted.
The executives covered various elements needed to ensure such unity of people and AI: Neri explained trust is a fundamental pillar, pointing to the establishment of principles by HP Labs in 2019 covering privacy, people and social responsibility.
“At HPE we are stewards of AI, upholding our principles with unwavering integrity”.
“We are all deeply aware of the transformative power of AI, but AI is hard,” he added, noting the technology was complicated and came with many risks.
Neri argued there is merit in not rushing into AI just to avoid being left behind: “innovation at any cost is very dangerous”.
But he referenced various initiatives HPE is already working on, including university research programmes into nature and clean energy involving petabytes of data, along with medical research where aligning privacy with the ability to share information is a key requirement.
Neri said collaboration is key to AI, but explained the recent focus on generative versions of the technology “requires even greater innovation”.
A hybrid cloud set-up will be needed because “AI is not one single thing”, instead comprising hundreds of microservices, multiple models, “specific accelerators” and the connection of “many different data sources”, all of which are highly distributed across hybrid IT infrastructure.
“At the same time, you must maintain data governance, regulatory compliance and security”, all factors Neri argued make “on-premise hybrid clouds” a must-have.
The heat is on
Both executives highlighted the issue of heat generation from computing systems in the AI age, with liquid cooling cited as the most effective means to address this.
Huang (pictured, left) explained liquid cooling systems deliver higher performance, “but also lower infrastructure costs” because various heat-transfer overheads are removed.
“So, the future of liquid cooling is going to result in everything from better performance, lower infrastructure costs and lower operating costs.”
Neri noted AI is pushing the boundaries of accelerated computing silicon, which he said will “require more power density,” adding “AI systems generate a lot of heat and waste”.
The HPE executive again pointed to his company’s long experience of tackling such issues, stating it brings two decades of expertise in liquid cooling systems to the table, garnered through its work on supercomputers.
We want information
Neri explained data has now become so important it is on the brink of being counted as an asset on company balance sheets.
But he noted some barriers to fully employing data, not least of which is speed of access.
Neri said many enterprises are prioritising the use of genAI at the edge, which involves “training existing models and running inferencing at the edge to solve business problems” spanning content generation, product design and optimisation, and enhancing customer service by using conversational AI.
Of course there was a pitch about HPE having the expertise needed to handle the masses of data required to train large language models, but Neri explained the speed of access to data is a perennial issue, referencing research by IDC which indicated few IT leaders can access information in real time.
“If you don’t have your data at the source then you cannot get it fast enough to use it. GenAI is a distributed workload, so connecting all your data is essential, which means you require a strong networking foundation from edge to cloud”.
Neri said HPE’s pending acquisition of Juniper Networks will bolster its capabilities, creating “an industry leader with a modern, more secure and AI-driven networking portfolio”. For service providers, this will set up a boost in automation and orchestration of virtualised network functions “to scale their infrastructure, reduce cost and accelerate the speed of delivery for new services”.
Enterprises, meanwhile, will enjoy simpler methods to manage complex infrastructures including private 5G.
Huang picked up on inferencing needs, explaining that employing the method across information stacks simplifies the use, protection and deployment of information by enterprises.
Connecting data to a model stack results in the latter becoming “the intelligent AI that can interact with, chat with, retrieve information from your company’s private” data.
“What we are looking to do in all our companies is to turn our corporate intelligence into digital intelligence and once we do that, we connect our data and our AI flywheel so we collect more data, harvest more insight, create better intelligence which allows us to provide better services”, Huang explained.
In turn, companies can reap more productivity from internal processes, “run faster, do things more efficiently” and at larger scale and, “very importantly, create new products”.
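Huang’s description of connecting private data to a model so it can retrieve and chat over it maps onto a retrieve-then-generate pattern. The Python below is a minimal, illustrative sketch of that idea only, not any vendor’s actual product: the embed function is a toy bag-of-words stand-in for a real embedding model, and the final generation step is left as a hypothetical llm_generate call.

```python
# Sketch of the "connect private data to a model" pattern: retrieve the most
# relevant internal documents for a question, then hand them to a language
# model as context. All names here are illustrative placeholders.
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real deployment would use a trained embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, documents: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the question and keep the top k.
    q = embed(question)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def answer(question: str, documents: list[str]) -> str:
    # Build a prompt combining retrieved private data with the user's question.
    context = "\n".join(retrieve(question, documents))
    prompt = f"Context from private data:\n{context}\n\nQuestion: {question}"
    # A real system would pass this prompt to a generative model, e.g. a
    # hypothetical llm_generate(prompt); here the prompt itself is returned.
    return prompt

docs = [
    "Q3 service revenue grew 4 per cent year on year.",
    "The private 5G trial with the logistics partner completes in November.",
    "Headcount in the networking unit rose after the acquisition closed.",
]
print(answer("How did service revenue perform?", docs))
```

In this kind of loop, each query and its outcome can also be logged as new data, which is the “flywheel” Huang describes: more usage yields more data, which in turn improves the retrieval and the model behind it.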
The need for collaboration remained a key theme of the keynote, a feature highlighted by the executives appearing alongside one another. But there was also a sense of realism and pragmatism to their presentations, a potentially positive development as companies begin to work through the hype cycle and look to the practicalities of AI deployments.