
Cloud Supercomputing (part 3): Factors in development?

In this four-part series we consult a panel of 12 experts to determine the role of cloud supercomputing now and in the future. Part three looks at the factors that are likely to impact development.


“The main factor will be access to resources matched to the needs of the problem. There are things that can be done on a leading ‘TOP500’ machine such as the existing ‘Titan’ machine or the upcoming ‘Summit’ machine scheduled for 2017, which is expected to be at least five times faster than Titan. Some problems simply cannot be addressed on lesser resources in a meaningful way or within a meaningful timeframe. There is no reason why cloud-based access cannot be used for ‘supercomputing resources’ of this level of capability, and it is already being done today. It is more a question of who has access to what, where, and how.”

Peter ffoulkes, 451 Research


“Among those factors obstructing the rise of cloud supercomputing, perhaps the most important are cost and capacity. Moving supercomputer computations to the cloud would currently be too labour-intensive and expensive.

Similarly, maximum surge capacity is too small to make a meaningful difference at the moment.

It is sometimes said that these challenges can be overcome within the next five years through the development of technology which automatically packages computations for the cloud, but it remains to be seen whether this kind of automation is really possible.”

Dariush Marsh-Mossadeghi, director of technology strategy & architecture at DataCentred


“In an HPC equivalent of data sovereignty, clusters of high-spec VMs need to be co-located to ensure low latency and to allow a choice of the best storage and network speeds.

In some ways, the challenge is not so much getting hold of these resources, but making them available to teams according to agreed policies, keeping track of who's using what resource, for how long and for what project in order to manage costs. Just as enterprises need to protect customer data and constrain costs, research teams need to protect sample data, IP and stay within budget. Ideally, an organisation would have a portal which managed access to these resources.”

Ian Finlay, CEO at Abiquo


“The rise of cloud supercomputing will be impacted by many factors. The entire industry – from software to networking to hardware and facilities – is going to have to collaborate effectively, much as the consumer and B2B cloud sectors have done, and improvements will need to be made to supporting infrastructure.

As compute increases and we head to exascale, a huge challenge in itself, we can no longer ignore the energy intensity of the cooling or power transformation – in many cases this is half the energy bill. Instead, the industry needs to take a more holistic approach to supercomputing. This includes infrastructure investment and energy efficiency, and that will go beyond PUE to overall performance per watt. As cloud supercomputing develops, we will also need to align with smart cities. It’s vital we look at how our large computing facilities can integrate with society, using waste heat effectively to reduce overall costs and using the compute resource to help solve real time, real world issues.”

Peter Hopton, founder of Iceotope


“In order for cloud supercomputing to really take off in the future, businesses need to better understand the security aspects of different cloud environments. They also need to explore a further set of crucial elements in their cloud strategy – cloud-to-cloud backup, disaster recovery as a service and other security services – that can be implemented to safeguard sensitive company data stored in the cloud.”

Alex Guillen, go-to-market marketing manager at Insight UK


“With compute and storage increasingly being treated as a commodity, cloud supercomputing has become a viable resource for more firms wanting to get in on the ‘research game’ due to lower price points.

However, caution must be taken that compliance is not impacted; some cloud providers do not reveal where customer data is held. If an end-user’s data needs to remain in-country, this could affect its choice of provider.”

Asad Malik, product manager at MTI


Read the rest of this series:

Cloud Supercomputing (part 1): What is it?

Cloud Supercomputing (part 2): Where is it heading?

Cloud Supercomputing (part 4): How will it help society?
