There are some who like the idea of a super-intelligent computer driving their car. These switched-on, tuned-in, connected cars will be in constant dialogue with other vehicles, traffic systems and a myriad of other machines in the world of the Internet of Things. They promise safe, collision-free, smooth transportation with no delays: altogether a hassle-free way to travel.
What’s not to like? Well, many are wary, saying that without the human cognitive skills that enable us to make instant, often subconscious, complex decisions while driving, it’s frankly a disaster waiting to happen. Abdicating responsibility for controlling a car to a computer is the road to disaster, they say, with technical malfunctions, cyber-attacks and serious accidents proving you need the human touch. Computers may be intelligent, they argue, but they lack cognitive reasoning.
That’s the point. Drivers can drive because they learn and improve. Like many things in life, it comes down to experience and intelligence: the two components of cognitive reasoning.
Driverless-car dissenters base their views on the fact that computers must be programmed in advance: they can only make decisions within defined parameters and for a series of pre-determined scenarios.
Proponents, however, argue that with AI and ML (artificial intelligence and machine learning), computers can match, even surpass, the cognitive skills of humans. Out goes the need for humans to conceive every possible scenario; in comes the ability for machines to learn and reason for themselves, at super-fast speed and based on masses of real-time data that they and other connected machines collect and share.
True, there are big hitters who caution us to be wary of letting machines have such an upper hand. When the likes of Elon Musk and the late Stephen Hawking raise concerns, we need to take a long second look at AI and ML.
But that debate is for another day. Back to driverless cars. Think about it: a road full of cars that keep moving by synchronising their speeds, communicating with each other. No more long jams caused by ‘volume of traffic’.
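The synchronisation idea can be sketched as a toy simulation: each simulated car nudges its speed toward the speed broadcast by the car ahead, damping the stop-start waves that cause phantom jams. This is an illustrative model only; the function name, the gain factor and the update rule are assumptions for the sketch, not any real vehicle-to-vehicle protocol.

```python
# Toy model of speed synchronisation between connected cars.
# Each follower blends its speed with the speed broadcast by the
# car ahead; car 0 is the leader. Illustrative only - not a real
# V2V protocol or controller design.

def synchronise(speeds, gain=0.5, rounds=20):
    """Repeatedly move each car's speed toward its predecessor's."""
    speeds = list(speeds)
    for _ in range(rounds):
        for i in range(1, len(speeds)):
            speeds[i] += gain * (speeds[i - 1] - speeds[i])
    return speeds

# A leader at 60 km/h with followers caught in a slow patch:
print(synchronise([60, 40, 20, 55, 30]))
```

Run with these numbers, every follower converges toward the leader’s 60 km/h, which is the point of the paragraph above: the jam dissolves because the cars coordinate rather than react late.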
It’s about to happen, according to reports from Japan. Technology firm ZMP is hoping to have a fleet of autonomous cars up and running in time for the 2020 Olympics.
The thing is, most of the technology needed for autonomous driving not only exists, but is already installed on most newer cars. We have cruise control, parking sensors and cameras to assist with reversing, emergency braking, anti-skid correction and automatic clutches. So cars can now control speed, change gear, steer, react to a sudden change in conditions and monitor external obstacles.
Perhaps the only thing we have still to see installed is the very soul of driverless cars: the ability for vehicles to ‘talk’ to each other and to traffic monitoring and control systems. Those trials in Japan – the latest of many tests over the past year or so – seem to indicate we are about to break through the barrier.
Automated driving, using connected computers, is part and parcel of the Internet of Things: that rapidly growing network of smart devices that communicate without human input. They generate and share data and use it to make informed, instant decisions on, for example, domestic energy use (running the washing machine during low-demand, low-cost times). The IoT is still in its relative infancy, but not for long. It’s about to come of age, and it is part of the wider AI and ML revolution.
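The washing-machine example can be sketched in a few lines: a smart device scans a tariff forecast and picks the cheapest contiguous window to run its cycle. The tariff figures, function name and two-hour cycle are all illustrative assumptions, not a real smart-home API.

```python
# Toy IoT scheduling decision: choose the cheapest contiguous
# window in an hourly price forecast to run a 2-hour washing cycle.
# Prices and names are illustrative, not a real smart-home API.

def cheapest_start(prices_per_hour, duration=2):
    """Return the start hour that minimises total cost for the cycle."""
    costs = [
        sum(prices_per_hour[h:h + duration])
        for h in range(len(prices_per_hour) - duration + 1)
    ]
    return costs.index(min(costs))

# Hourly tariff (pence/kWh) from midnight onwards:
tariff = [8, 7, 5, 5, 6, 9, 14, 18]
print(cheapest_start(tariff))  # 2, i.e. start the cycle at 02:00
```

The ‘informed, instant decision’ here is trivial arithmetic; the IoT point is that the device makes it on live shared data, with no human input.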
Is there a difference between AI and ML? According to Nidhi Chappell, head of machine learning at Intel (reported in Wired): “AI is how we make machines intelligent, while machine learning is the implementation of the compute methods that support it. The way I think of it is: AI is the science and machine learning is the algorithms that make the machines smarter.” So for AI, the enabler is machine learning.
That still poses the question: is AI just a concept in development, or will we start to see it moving into the mainstream? In reality, it’s already here: planes flying on autopilot, the Waze real-time traffic congestion app, email spam filters, Google search terms, shopping recommendations on Amazon and face recognition on Facebook are all examples of AI in action. Gartner has stated that AI will be one of the major trends shaping future technologies.
Behind this lie dramatic improvements in storage and connectivity. Everything is faster and more powerful, meaning functions can now be performed that were not possible 10 years ago.
And at the heart of AI lies analytics. “AI is high performance computing. It can go from GPU to GPU (graphics processing unit) without going through a CPU. Everything is HPC based,” said Exertis Hammer’s Tom Cox. “It’s high performance servers and high-end workstations. Nvidia has made dramatic inroads with AI utilising NVLink within video.” Nvidia recently launched its latest GPU, the Turing, drawing on its market-leading development in AI chips, 3D rendering and fluid modelling, and Supermicro has just introduced an AI-optimised GPU server. The hardware is keeping up with the AI revolution.
Bhushan Desam, Lenovo’s global AI business leader, quoted in Computer Weekly, agrees that the driver of AI will be HPC. Characteristics such as InfiniBand high-speed networking and GPU-based processing power will enable enterprises to make sense of data in hours, not weeks, making AI of real commercial value.
This commercialisation of AI is one reason why Gartner says that AI technologies will be “virtually everywhere” within the next 10 years, helped by the cloud and open-source software. Everywhere… including the datacentre. Advancements in data management and solution integration create the conditions for machine learning, which is the next step in the evolution toward full AI, said Jon Nordhausen of Fiserv. “Machine learning is the point at which bots gain the ability to create new assumptions and predict outcomes based on a range of existing information, without being explicitly guided by human programming.”
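That distinction, predicting outcomes from existing data rather than following hand-written rules, can be shown with a minimal sketch: nobody programs the rule into the code below; a least-squares fit recovers it from example data. This is a generic illustration in plain Python, not a description of Fiserv’s or anyone else’s systems.

```python
# Minimal illustration of learning versus explicit programming:
# the rule y = 2x + 1 is never written into the code; the model
# recovers it from example data via an ordinary least-squares fit.

def fit_line(xs, ys):
    """Least-squares fit for a single feature: returns (a, b) in y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

xs = [1, 2, 3, 4, 5]
ys = [3.1, 4.9, 7.2, 9.0, 11.1]   # noisy observations of y = 2x + 1
a, b = fit_line(xs, ys)
print(f"learned rule: y = {a:.2f}*x + {b:.2f}")  # learned rule: y = 2.01*x + 1.03
```

Real machine learning swaps this one-line model for deep networks with millions of parameters, which is exactly why the HPC hardware discussed above matters, but the principle is the same: the behaviour comes from the data, not the programmer.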
AI is a far-reaching, wide-ranging subject, but one aspect that is beyond debate is that it will generate, and already generates, a lot of data: data that needs to be stored in an easy-to-retrieve way. And that’s where specialists in servers, storage and networking solutions come into their own, distributors such as Exertis Hammer.
“AI, ML and IoT may be buzz phrases, but they are major innovations that will impact every sector,” said Exertis Hammer’s Tom Cox. “While they are all data heavy, the key aspect is the ability to retrieve information instantly.”
That’s why connectivity is key. As this article on CIO.com states: “Extreme volumes of data must be collected and processed in real time. Networks built as recently as 10 years ago weren’t required to collect, route, and process this vast amount of data at real-time speeds. Typical networks contained a web of hardware and cabling, a one-size-fits-all offering of bandwidth and throughput, which was far too cumbersome to handle today’s AI and machine learning applications.”
Becoming AI-ready means having the networks as well as the storage capacity.
Time will tell whether AI proves as disruptive to daily life as many predict. In the meantime, you can rely on Exertis Hammer to be informative, unbiased and at the forefront of market changes.
Published Date: 12/10/2018