
HPE buys supercomputing giant Cray for $1.3 billion

Hewlett Packard Enterprise (HPE) is acquiring the supercomputer leader Cray in a deal worth $1.3 billion. The company aims to address the challenges driven by the explosion of data.

Cray systems power a significant share of the top 100 supercomputer installations globally. Founded in 1972, Cray provides high-end supercomputing solutions for challenging and data-intensive workloads. The company delivers its supercomputing systems through the current-generation XC and CS platforms, as well as the Shasta series platform.

The supercomputing leader recently announced an exascale supercomputing contract with the US Department of Energy’s Oak Ridge National Laboratory. The contract was valued at over $600 million, enabling innovative research and AI at scale.

HPE will combine Cray’s deep supercomputing capabilities with its own cutting-edge technologies, advancing its strategy to tackle customers’ most data-intensive challenges.

Modern technologies like AI, machine learning, big data and analytics, together with customers’ changing demands for data-intensive workloads, are driving explosive growth in data generation. This in turn is driving the expansion of high-performance computing (HPC).

With the Cray acquisition, HPE will establish a broad portfolio of computing, storage, interconnect, software and services in the HPC and AI segments.

HPE already sees HPC as a key element of its vision and growth strategy. It provides a number of HPC solutions, which include HPE Apollo and SGI. The company will now be well positioned to strengthen its existing portfolio with the foundational technologies of Cray.

“Answers to some of society’s most pressing challenges are buried in massive amounts of data,” said Antonio Neri, President and CEO, HPE.

“Only by processing and analyzing this data will we be able to unlock the answers to critical challenges across medicine, climate change, space and more. Cray is a global technology leader in supercomputing and shares our deep commitment to innovation. By combining our world-class teams and technology, we will have the opportunity to drive the next generation of high performance computing and play an important part in advancing the way people live and work.”

Together, the companies will also be able to reach more segments of the market, providing a wide range of solutions to enterprise, academic, and government customers.

Also read: HPE expands its storage networking portfolio to meet demands of next-gen technologies

“This is an amazing opportunity to bring together Cray’s leading-edge technology and HPE’s wide reach and deep product portfolio, providing customers of all sizes with integrated solutions and unique supercomputing technology to address the full spectrum of their data-intensive needs,” said Peter Ungaro, President and CEO of Cray.

“HPE and Cray share a commitment to customer-centric innovation and a vision to create the global leader for the future of high performance computing and AI. On behalf of the Cray Board of Directors, we are pleased to have reached an agreement that we believe maximizes value and are excited for the opportunities that this unique combination will create for both our employees and our customers.”

The transaction is expected to close by the first quarter of HPE’s fiscal year 2020, subject to regulatory approvals and other customary closing conditions.


Discover new IT strategies, products and services, and learn from peers at Interop 2019, an unbiased IT conference

The global information technology (IT) services market is expected to reach $748 billion by 2020. — Statista

The IT sector is a key driving force for the global economy and has a cascading effect on other industries and markets.

Cloud computing, data analytics, software development, artificial intelligence, and other emerging technologies have now become the basic requirements for every business to survive in the era of digital transformation.

Innovation is knocking on the doors of every industry. As a result, enterprises today are adopting IT technologies to improve customer experience, shorten time to market, enhance operational efficiency, and reduce operating costs.

To provide enterprises and IT leaders a complete, objective view of developments across all IT disciplines, Interop 2019 is coming to Las Vegas on May 20-23.

Interop 2019 conference: At a glance

With the theme “Keeping IT Real”, Interop 2019 is an unbiased IT conference that will help attendees discover new strategies, products, and services. It will also help them hear from peers in the industry facing similar challenges and issues.

Interop is an independent platform where IT professionals can meet and learn from each other about everything that is going on in the industry. It will feature speakers from industry leaders like Google, Microsoft, Juniper Networks, Cisco, Delta Dental, and Red Hat.

The conference will bring together IT and business professionals at all levels who are tasked with leveraging technology to drive their organizations forward.

Topics to be discussed at Interop 2019

The four-day conference will cover all aspects of the IT industry, helping attendees develop the skills needed to manage a successful IT organization:

  • Cloud
  • Data & Analytics
  • DevOps
  • Emerging Tech
  • Infrastructure
  • IT Strategy
  • Professional Development
  • Security

Expert speakers at Interop 2019

  1. Shawn Anderson, Executive Security Advisor, Microsoft
  2. Sonia Cuff, Cloud Advocate, Microsoft
  3. Khadija Mustafa, Sr. Director of Business AI, Microsoft
  4. Jim Carey, Product Management Lead, IBM
  5. Michael Melore, Cyber Security Advisor, IBM
  6. Doug Lhotka, Cybersecurity Architect, IBM
  7. Aurora Morales, Search Outreach Specialist, Google
  8. Ajay Chenampara, Domain Architect, Red Hat
  9. Renee McKaskle, CIO, Hitachi Vantara
  10. Matthew Oswalt, Network Reliability Engineer, Juniper Networks
  11. Hank Preston, Network Engineer, Cisco
  12. Jasdeep Singh, Security Engineer, AT&T
  13. Shekar Atmakur, Manager, KPMG
  14. Deborah Adleman, Director, US and Americas Data Protection Leader, EY
  15. Genetha Gray, Lead People Research Scientist, Salesforce

To check the full list of speakers, click here.

Why attend Interop 2019, the unbiased IT conference?

Unlike a typical vendor show, Interop will provide an unbiased view of what is going on across the IT sector and how other IT leaders are keeping up with the rapid pace of change.

Here are the key benefits of attending the Interop IT conference:

  • Take your career to the next level with IT education that is vendor-agnostic and grounded in real-world experience.
  • Meet industry leaders and disruptive newcomers in the IT sector.
  • Interact with peers, exchange ideas, and have conversations in a relaxed, collegial environment.
  • Have fun in Las Vegas: TechFair, yoga, a 5K run, and the attendee party.


To learn about passes and prices and to register for the event, follow this link.

Daily Host News (DHN) is the official media partner of this event. Stay tuned for the latest updates.


Dell EMC helps enterprises tame explosive growth of unstructured data with new Isilon and ClarityNow solutions

Dell EMC Isilon F810 Scale-Out NAS Platform Delivers Extreme Performance and Efficiency; Dell EMC ClarityNow Software Provides Visibility, Control and Mobility of Unstructured Data in the Data Center and in the Cloud

The explosion of unstructured data is demanding new approaches and capabilities for organizations to unlock their data capital and enable digital transformation. To tackle these challenges, Dell EMC today announced a new addition to its flagship Isilon All-Flash storage system, along with the release of new Dell EMC ClarityNow software to give organizations visibility, control and mobility of unstructured data both on-premises and in the cloud.

Many enterprises today are looking to accelerate business outcomes with powerful, next-generation unstructured data applications in areas such as data analytics, artificial intelligence and electronic design automation (EDA). These workloads often require the extreme performance of All-Flash storage. At the same time, many organizations are being pressured to tightly limit capital equipment purchases and reduce related IT operating costs.

In this environment, organizations also need to manage growing volumes of unstructured data effectively so that they can be more productive and efficient in unlocking the value of their enterprise data.

“Modernizing the IT infrastructure is an essential first step to driving digital business initiatives and managing all of their data more effectively,” said Jeff Boudreau, President, Storage, Dell EMC.

“The Dell EMC Isilon F810 scale-out NAS storage addresses these challenges by delivering extreme performance and efficiency to support demanding unstructured data workloads. And because nobody knows the value of data better than the people who create it, Dell EMC ClarityNow offers organizations a holistic data view across file and cloud storage, and allows end users to locate, use and extract value from their file-based data wherever it resides.”

Isilon F810 Delivers Extreme Performance, Efficiency and Capacity

The Isilon F810 delivers up to 250,000 IOPS and 15 GB/s of bandwidth per 4U chassis, with predictable, linear scalability up to 9M IOPS and 540 GB/s of aggregate throughput in a single 144-node cluster to meet demanding performance requirements.
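As a quick sanity check, the quoted cluster figures are consistent with linear scaling of the per-chassis numbers (this sketch assumes four nodes per 4U chassis, which the chassis and cluster figures imply):

```python
# Per-chassis figures quoted for the Isilon F810 (assumed: 4 nodes per 4U chassis).
IOPS_PER_CHASSIS = 250_000
THROUGHPUT_GBPS_PER_CHASSIS = 15
NODES_PER_CHASSIS = 4

cluster_nodes = 144
chassis_count = cluster_nodes // NODES_PER_CHASSIS            # 36 chassis

cluster_iops = chassis_count * IOPS_PER_CHASSIS               # 9,000,000 IOPS
cluster_throughput = chassis_count * THROUGHPUT_GBPS_PER_CHASSIS  # 540 GB/s

print(f"{cluster_iops:,} IOPS, {cluster_throughput} GB/s aggregate")
```

Both results match the quoted cluster-level numbers exactly, confirming that the claim is straight linear extrapolation from a single chassis.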

With an inline data compression ratio of up to 3:1, the Isilon F810 enables organizations to reduce raw all-flash storage requirements and provides up to 33% more effective storage capacity per raw TB than major competitive all-flash offerings.

With a corresponding increase in storage density, the F810 provides an effective storage capacity of up to 2.2 PB per 4U chassis and up to 79.6 PB in a 144-node cluster. The highly dense storage solution can help reduce datacenter footprint and related costs including floor space, power and cooling.

In keeping with the longstanding Dell EMC commitment to providing future-proof Isilon storage solutions, the new Isilon F810 storage platform integrates easily into existing Isilon clusters without disruption and without the need to perform manual data migration. This is another example of how organizations can continue to rely on Isilon innovation for their future unstructured data storage needs.

Powered by the Isilon OneFS operating system, the Isilon F810 and other Isilon all-flash, hybrid and archive platforms can be combined into a single Isilon cluster that provides powerful advantages for modern IT environments.

With Isilon OneFS and its extensive multi-protocol capabilities, organizations can consolidate data, eliminate inefficient storage silos, simplify management, and support a wide range of applications and workloads on a single storage platform.

This also enables organizations to leverage powerful data analytics technologies to unlock the value of their data capital. To optimize storage resources and further lower costs, Isilon also offers automated storage tiering and cloud integration with a choice of public and private cloud storage providers.

Isilon F810 All-Flash Data Reduction Guarantee

Isilon storage systems are covered under the Dell EMC Future-Proof Loyalty Program, giving customers additional peace of mind with guaranteed satisfaction and investment protection for future technology changes.

Included in the program benefits, Dell EMC guarantees that for any new purchase of an Isilon F810, for a period of one year from the date of delivery, the Isilon F810 will provide logical usable capacity, including all data, equivalent to at least two times (2x) the usable physical capacity.
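As a back-of-the-envelope illustration (a sketch based only on the ratios quoted above, not a sizing tool; note the guarantee applies to usable physical capacity, which is somewhat less than raw), the up-to-3:1 compression figure and the 2x guarantee translate to logical capacity as follows:

```python
def effective_capacity_tb(physical_tb: float, reduction_ratio: float) -> float:
    """Logical capacity obtained from flash at a given data-reduction ratio."""
    return physical_tb * reduction_ratio

# Best case quoted for the F810: up to 3:1 inline compression.
best_case = effective_capacity_tb(100, 3.0)         # 300 TB logical from 100 TB

# Floor promised by the Future-Proof guarantee: at least 2x usable physical.
guaranteed_floor = effective_capacity_tb(100, 2.0)  # 200 TB logical

# The guarantee is a conservative floor below the best-case compression ratio.
assert guaranteed_floor <= best_case
```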

Dell EMC ClarityNow Provides Data Management for IT and Business Users

New Dell EMC ClarityNow data management software is designed to help organizations efficiently manage their file-based workflows. ClarityNow is a highly complementary solution to Dell EMC Isilon and ECS, enabling a unified global file system view across heterogeneous distributed storage and the cloud.

The software allows IT to gain better insights into enterprise file data usage and storage capacity, while also empowering end users and content owners with self-service capabilities to find, use and move files anywhere within the unified global file system.

These powerful features can help unlock the value of data capital and accelerate business outcomes by giving users the flexibility to index and gain visibility into billions of files and folders that would otherwise be trapped in storage silos based on their physical location.

Also read: Dell EMC Unity Midrange Storage advances data reduction, data protection, and management in latest OS update


Dell EMC Isilon F810 and Dell EMC ClarityNow software are now available globally through Dell EMC and its authorized channel partner network.

“In today’s data-driven economy, business success increasingly depends on how well a company maximizes the value of its data, especially its file data. To support demanding, next-generation file workloads, file storage must combine massive scalability, performance and efficiency,” said Scott Sinclair, Senior Analyst, Enterprise Strategy Group.

“By offering the Isilon F810 NAS platform, Dell EMC is directly addressing the challenges of modern file storage demands, not only with its enormous capacity and all-flash performance, but also with its impressive efficiency that will serve to lower hardware storage costs.” 

Image source: Dell EMC


Puppet buys startup Reflect for data visualization capabilities

Puppet, the DevOps automation company, recently announced that it has acquired data-visualization-as-a-service platform Reflect Technologies.

Founded in 2015, Reflect is a Portland-based startup that raised $2.5 million in seed funding in 2016. What sets Reflect apart from other data visualization companies is that it delivers its product as a service, helping enterprises add analytics capabilities to their software and deliver those capabilities to end customers.

The acquisition of Reflect will help Puppet speed up product innovation, so that it can deliver modern, flexible analytics capabilities to customers through its automation platform.

“We’ve always helped customers mine valuable information about the state of their IT landscape and take action on it. With the innovation and talent in the marketplace today, we have an opportunity to improve that experience—giving customers new ways to leverage their data, and make faster, more informed decisions,” said Sanjay Mirchandani, CEO, Puppet. “Reflect brings the right pedigree of modern technology and unique talent to make this a perfect match.”

Puppet believes the integrated solution will help companies make better decisions as their footprints rapidly expand and become more complex, providing clear insights across every platform.

“Reflect helps organizations transform their raw data into visual stories that are easy to understand,” said Alex Bilmes, CEO and co-founder, Reflect. “With Puppet’s technology and expertise, we are able to capture an incredibly rich dataset, unlike any other available today. By joining forces, we will be able to deliver value from that data through beautiful charts, visualizations and interactive data tools.”

In September 2017, Puppet acquired the continuous delivery automation software company Distelli to add container and application capabilities to its automation platform.

Also read: Donuts launches BL.INK platform to replace clunky URLs with meaningful short links

As a part of the acquisition, Puppet has absorbed all the products and employees of Reflect. The terms of the deal were not disclosed.


Azure Time Series Insights, a managed analytics service for IoT devices, now available in public preview

Microsoft recently announced the public preview of Azure Time Series Insights (TSI), a fully managed analytics, storage and visualization service used to manage IoT time-series data in the cloud.

With the rapid rise in connected devices and advances in data collection, enterprises struggle to derive insights from the data generated by geographically dispersed connected solutions.

Businesses need to derive insights from streams of events in near real time; when insights are delayed, the impact can include significant downtime.

The time series data solutions are critical for data scientists and process engineers in various industries to perform tasks like data analysis, storage, and KPI tracking.

Hence, it is necessary for organizations to optimize their operations and gain valuable insights in near real-time to gain a competitive edge.

Azure Time Series Insights will make it much easier for businesses to explore and analyze events from a large number of sources, such as IoT devices. It provides a near real-time view of terabytes of data and enables users to validate IoT solutions so that they can avoid downtime.

Using the new service, organizations can also detect hidden trends, spot anomalies and conduct root-cause analysis without writing a single line of code. Users can integrate the capabilities of Time Series Insights into their existing workflows or applications.
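Although the service emphasizes no-code exploration through the TSI Explorer, the same data is reachable programmatically. The sketch below builds (but does not send) an events query against the TSI REST API; the endpoint format, api-version, environment FQDN, and request-body shape here are assumptions drawn from the preview-era documentation and should be verified against current Azure docs:

```python
import json

# Assumed preview-era values -- verify against current Azure TSI documentation.
ENVIRONMENT_FQDN = "12345678-1234-1234-1234-123456789abc.env.timeseries.azure.com"
API_VERSION = "2016-12-12"

def build_events_query(span_from: str, span_to: str, top: int = 100):
    """Return the URL and JSON body for a TSI 'get events' query (not sent)."""
    url = f"https://{ENVIRONMENT_FQDN}/events?api-version={API_VERSION}"
    body = {
        # Time window to search over, in ISO 8601 UTC timestamps.
        "searchSpan": {
            "from": {"dateTime": span_from},
            "to": {"dateTime": span_to},
        },
        # Return the most recent `top` events, newest first.
        "top": {
            "sort": [{"input": {"builtInProperty": "$ts"}, "order": "Desc"}],
            "count": top,
        },
    }
    return url, body

url, body = build_events_query("2018-04-01T00:00:00Z", "2018-04-02T00:00:00Z")
print(url)
print(json.dumps(body, indent=2))
```

In practice the request would be sent as an authenticated POST (Azure AD bearer token) by whatever HTTP client the surrounding application already uses.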

TSI stores the data in memory and SSDs for up to 400 days, so that users can access it easily. It will provide out-of-the-box visualization through the Time Series Insights Explorer.

Furthermore, it will have integration with advanced analytics and machine learning tools including Spark and Jupyter notebooks. Microsoft said that it is fully integrated with cloud gateways like Azure IoT Hub and Azure Event Hubs.

Microsoft claimed that enterprises across industries such as wind farms, automotive, elevators, smart buildings, and manufacturing have been using the analytics service since its private preview.

Also read: Azure Databricks analytics platform now generally available within Microsoft public cloud

“Time Series Insights and other Azure IoT services have helped companies like BMW improve operational efficiency by reducing SLAs for validating connected device installation, in some cases realizing a reduction in time from several months to as little as thirty minutes,” said Joseph Sirosh, Corporate Vice President, Artificial Intelligence & Research, Microsoft.


Dell EMC’s new Machine and Deep Learning solutions to bring power of HPC and data analytics to enterprises

At the Supercomputing 2017 conference, Dell EMC announced new machine learning and deep learning solutions that bring the power of HPC (high performance computing) and data analytics to mainstream enterprises.

The new solutions combine HPC and data analytics to empower enterprises with opportunities in image processing, fraud detection, financial investment analysis and other similar areas through ready bundles for easier deployment.

Artificial intelligence techniques like machine and deep learning are increasingly deployed by enterprises, but few have the skill set and expertise required to manage such systems effectively. Dell EMC’s new solutions, built around its expertise in HPC and data analytics, offer customers the ability to extract maximum insight from their collected data for faster, better performance.

“Our customers consistently tell us that one of their biggest challenges is how to best manage and learn from the ever-increasing amount of data they collect daily. With Dell EMC’s high-performance computing experience, we’ve seen how our artificial intelligence solutions can deliver critical insights from this data, faster than ever before possible. Working with our strategic technology partners, we’re able to bring these powerful capabilities to all enterprises. When you think about what this means for industries like financial services or personalized medicine, the possibilities are endless and exciting,” said Armughan Ahmad, Senior Vice President and General Manager, Hybrid Cloud and Ready Solutions, Dell EMC.

The new solutions bring together optimized, pre-tested and validated servers, storage and networking for machine and deep learning applications. Simplified identification, analysis and automation of data patterns will help customers use data insights in a variety of applications, from facial recognition in security to studying human behavior in the retail industry.

Customers will also benefit from the new Dell EMC PowerEdge C4140 server, which supports NVIDIA’s latest-generation GPU technology.

The Dell EMC Ready Bundles will be available in the first half of 2018 through Dell EMC and its channel partners, while the Dell EMC PowerEdge C4140 will be available in December 2017.

With AI going mainstream, technology vendors like IBM, HPE and Dell EMC are pushing to make their services HPC- and AI-capable and to help developers and enterprises deploy HPC applications. While HPE has been a leading name in HPC and supercomputing for years, other vendors are also increasing their efforts in the realm. These include IBM, which has integrated its PowerAI deep learning enterprise software with its Data Science Experience.


HPE brings a modular and cloud-ready in-memory computing platform for enterprises of all sizes

Hewlett Packard Enterprise (HPE), yesterday launched HPE Superdome Flex, a scalable, modular and cloud-ready platform that can carry out high-speed analytics on large volumes of data for enterprises of all sizes.

HPE Superdome Flex features a unique modular design that offers memory capacity of up to 48 TB and scalability of up to 32 sockets, 2.3 times that of the prior generation. It enables enterprises to keep pace with growing data demands and safeguard critical workloads.

HPE’s mission-critical in-memory computing platform is built on Memory-Driven Computing design principles that can enhance analytics performance by 100 times or more.

“Customers want to harness all of their data to derive actionable insights in real-time to make more impactful business decisions,” said Randy Meyer, vice president and general manager, Synergy & Mission Critical Servers, Hewlett Packard Enterprise. “With HPE Superdome Flex, customers can capitalize on in-memory data analytics for their most critical workloads and scale seamlessly as data volumes grow.”

It scales seamlessly from 4 to 32 sockets in 4-socket increments, and reduces costs by up to 45% compared to the previous generation. The platform also offers over 50 times the headroom for growth, so enterprises that start small can grow with the demands of their business. With its in-memory design, enterprises can analyze data from the digital core to the intelligent edge in real time.

The Superdome Flex combines the reliability of HPE’s Integrity Superdome X with the scalable in-memory technology of SGI, which HPE acquired last year, for high-performance data analytics.

Also read: HPE completes spin-off and merger deal with Micro Focus

HPE said that the Superdome Flex is a platform that offers proven reliability, availability and serviceability (RAS) capabilities and provides 99.999% single-system availability.

HPE Superdome Flex is generally available now.


AI and cognitive computing – spearheading enterprise digital transformation

Artificial Intelligence, robotics and machine learning concepts are changing the parameters of data extraction and analysis. With the onset of digital transformation, there’s a great need to process huge volumes of data into useful insights.

But with traditional data analytics methods, businesses are not able to process all of this data, especially data in the form of images, videos, and human voice, collectively known as ‘dark data’. To process such data, organizations need cognitive computing.

The concept of cognitive computing originally comes from cognitive science, which includes the study of the human brain and how it works. Cognitive computing aims to simulate human thought processes in a computerized way. It achieves this with the help of various self-learning algorithms, natural language processing (NLP), machine learning, and automated reasoning.

Companies are trying to deliver AI and cognitive based solutions to help organizations deal with various structured and unstructured data sets.

Also Read: Apple adopts the AI move, introduces new machine learning API framework

Companies like Facebook and IBM are rolling out new cognitive and AI solutions for better data analytics. Facebook’s team of researchers at FAIR (Facebook Artificial Intelligence Research) has reportedly introduced a new capability for dialog agents: the ability to negotiate. According to the researchers, dialog agents with differing goals can participate in start-to-finish negotiations with other agents or people to arrive at common decisions or outcomes.

Also Read: New AI server designs to accelerate services of Facebook, Microsoft 

On the other hand, IBM has launched a suite of cognitive computing solutions to help financial institutions manage their regulatory and fiduciary responsibilities. The suite is powered by Watson and can be deployed on the IBM Cloud. It will help financial professionals understand regulatory requirements, gain increased insight into current or potential financial crimes, and manage financial risk with a new architectural approach to data.

Find more info here.

Organizations need more such cognitive and AI enabled smarter solutions and tools for their digital transformation.