Cloud Interviews

“New Dell EMC offerings to help organizations modernize IT infrastructure and unlock the value of their data capital”— Sri Seshadri, Consultant Product Manager, Dell EMC

Enterprises today are seeking powerful, next-generation unstructured data applications in areas such as data analytics, artificial intelligence and electronic design automation (EDA) to accelerate their business outcomes.

To run these workloads, organizations require extreme performance but are often constrained by tight budgets and pressure to reduce IT operating costs. To help organizations address this explosive growth of unstructured data, unlock their data capital cost-effectively and enable digital transformation, Dell EMC launched new Isilon and ClarityNow solutions.

Dell EMC Isilon F810 all-flash scale-out NAS provides extreme performance and efficiency to support demanding unstructured data workloads while significantly lowering the effective cost of all-flash storage solutions. And Dell EMC ClarityNow data management software enables businesses to locate, access and manage data quickly, no matter where it resides – across file and object storage, in the data center and in the cloud – and thereby accelerate business outcomes.

Read on as we interview Sri Seshadri, Consultant Product Manager at Dell EMC, to learn more about these solutions.

1. Studies show that 80% of the data today is unstructured, and it is growing both in volume and variety. What are the main data management challenges organizations face in such a scenario?

In an environment where many IT budgets are tightly controlled, organizations with traditional IT infrastructure are increasingly challenged to keep pace with rapidly increasing data storage requirements and growing management complexity. Traditional IT environments typically have a heterogeneous storage infrastructure often with dedicated infrastructure for specific applications and workloads.

With rapid data growth, these storage “silos” are often highly inefficient: some suffer from poor storage utilization, while others become over-capacity “hot spots” that cannot meet required performance SLAs. Despite these operational challenges, the rapid growth of unstructured data represents a significant opportunity for businesses. In an increasingly digital world, unstructured data is becoming, for many organizations, an enormously valuable asset – something we refer to as “data capital” – that can enable them to gain new insight, identify new opportunities and accelerate their business.

2. How can organizations tackle these challenges and unlock the power of unstructured data?

To increase efficiency, simplify management, and unlock the value of this “data capital”, organizations need a modern data storage infrastructure that can enable a digital transformation of their business and accelerate business outcomes. To achieve this, organizations can begin by consolidating their unstructured data onto a single storage infrastructure that can support a wide range of applications and workloads with varying performance requirements.

This will enable organizations to eliminate costly, inefficient storage silos and greatly simplify management. This approach also enables organizations to create a “data lake” that can be used for data analytics and business intelligence initiatives. Dell EMC Isilon scale-out NAS is an ideal choice for this type of infrastructure. Isilon storage solutions scale easily – from tens of terabytes to tens of petabytes – to keep pace with rapid data growth while remaining simple to manage, no matter how large the data environment becomes.

With 80% storage utilization and a highly dense architecture that reduces the storage footprint in the data center, Isilon storage efficiency can help organizations dramatically lower costs. To meet varying performance requirements, Isilon offers All-Flash, Hybrid and Archive platforms that can be combined into a single cluster with automated storage tiering and simple cloud storage integration to further optimize storage resources. Isilon’s extensive multiprotocol capabilities allow organizations to consolidate data and support a wide range of workloads – including powerful data analytics technologies – on a single storage platform that can help organizations unlock the value of data capital.

3. Dell EMC recently announced a new addition to Isilon All-Flash storage system. Please tell us about this solution and the new Dell EMC Isilon F810.

The Isilon F810 combines the extreme performance of all-flash with all of the advantages of the #1 scale-out NAS platform, plus a new inline data compression capability that expands effective storage capacity and density, enabling customers to achieve new levels of efficiency while lowering costs. Depending on the workload, the F810 delivers a data compression ratio of up to 3:1.

With the high-speed hardware-based inline data compression, the Isilon F810 provides up to 3 times more effective storage capacity than the existing Isilon F800 all-flash platform and up to 33% more effective storage capacity per raw TB than major competitive all-flash offerings.
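The capacity claims above come down to simple arithmetic. As a minimal sketch (the helper function and the sample figures are illustrative, not Dell EMC's; real-world ratios depend entirely on how compressible the data is):

```python
# Effective (logical) capacity given raw capacity and a compression ratio.
# Illustrative only: actual ratios vary by workload, up to the quoted 3:1.
def effective_capacity_tb(raw_tb, compression_ratio):
    return raw_tb * compression_ratio

# At a 3:1 ratio, 100 TB of raw flash presents up to 300 TB of effective capacity.
print(effective_capacity_tb(100, 3.0))  # 300.0
```

The same arithmetic underlies the "up to 3 times more effective capacity" comparison against a non-compressing platform of equal raw capacity.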

With an effective storage capacity of up to 2.2 PB per 4U chassis and up to 79.6 PB in a 144 node cluster, the Isilon F810 provides a highly dense storage solution that reduces datacenter footprint and related costs including floor space, power and cooling. Isilon F810 all-flash scale-out NAS is ideal for demanding unstructured data workloads including EDA, data analytics and artificial intelligence.

4. How does Isilon F810 guarantee satisfaction and investment protection for future technology changes?

In keeping with the longstanding Dell EMC commitment to providing future-proof Isilon storage solutions, the new Isilon F810 storage platform integrates easily into existing Isilon clusters without disruption and without the need to perform manual data migration. This is another example of how organizations can continue to rely on Isilon innovation for their future unstructured data storage needs.

The Isilon cluster scales easily in minutes with the addition of the F810 and keeps storage and data management simple, no matter how large the data environment becomes.

5. Can you please tell us more about Dell EMC Future-Proof Loyalty Program?

All Isilon storage systems are covered under the Dell EMC Future-Proof Loyalty Program, giving customers additional peace of mind with guaranteed satisfaction for three years and investment protection for future technology changes.

Among the program’s benefits, Dell EMC guarantees that, for any new purchase of an Isilon F810, the system will provide logical usable capacity, including all data, equivalent to at least two times (2x) the usable physical capacity for a period of one year from the date of delivery.

6. Along with Isilon F810, the company also released the new Dell EMC ClarityNow software. What’s the role of the new software in data management?

Dell EMC ClarityNow software enables organizations to locate, access and manage data in seconds, no matter where it resides – across file and object storage, in the data center and in the cloud. ClarityNow data management software manages file-based workloads and is a highly complementary solution to Dell EMC Isilon and ECS, enabling a unified global file system view across heterogeneous distributed storage and the cloud.

The software allows IT to gain better insights into enterprise file data usage and storage capacity, while also empowering end users and content owners with self-service capabilities to find, use and move files anywhere within the unified global file system. These powerful features can help unlock the value of data capital and accelerate business outcomes by offering flexibility for users to index and gain visibility into billions of files and folders that would otherwise be trapped in storage silos based on their physical location.

Also read: Dell EMC Unity Midrange Storage advances data reduction, data protection, and management in latest OS update

7. Wrapping up, what are the main benefits of these new solutions in tackling data management challenges?

  • Dell EMC Isilon F810 all-flash scale-out NAS provides extreme performance and efficiency to support demanding unstructured data workloads while significantly lowering the effective cost of all-flash storage solutions.
  • Dell EMC ClarityNow data management software enables businesses to locate, access and manage data quickly, no matter where it resides – across file and object storage, in the data center and in the cloud – and thereby accelerate business outcomes.

Both of these new powerful Dell EMC offerings are designed to help organizations modernize their IT infrastructure, gain new levels of efficiency and unlock the value of data capital.


Pure Storage unveils new data protection solution built for flash and cloud

All-flash data storage solutions provider Pure Storage has launched a new data protection platform called ObjectEngine. Built specifically for flash and cloud, the new platform will enable faster recovery and data centricity for modern enterprises.

Data protection has become crucial for enterprises in the era of digital transformation where every business is migrating to cloud. Traditional approaches to data protection don’t work for modern enterprises that have adopted cloud.

Legacy backup solutions are also around ten times slower when restoring data, while enterprises today want to access and restore all their data in real time. According to industry research, 57% of enterprises will change their existing data protection solution within the next two years.

Pure Storage’s ObjectEngine is a modern data protection solution designed to address contemporary data protection challenges for enterprises, restore data on demand, and enable applications in the cloud.

In September 2018, Pure Storage acquired StorReduce, a cloud-first, software-defined storage solution for managing large-scale unstructured data. The company says it has built ObjectEngine on StorReduce’s cloud-native technologies.

This makes ObjectEngine a unified solution for cloud and on-premises environments, providing seamless, rapid backup and recovery across both.

“For too long, backup and protection has been an insurance policy rather than a strategic asset. In today’s ultra-competitive environment, organizations need every advantage possible to ensure they get the most value out of their data. That means fast recovery to ensure data is back in production use as quickly as possible — modern organizations simply cannot afford to wait days or weeks,” said Matt Burr, General Manager for FlashBlade, Pure Storage.

“ObjectEngine offers an evolved, cloud-centric approach to business continuity that can help forward-looking customers do more with their data.”

Also read: Dell EMC helps enterprises tame explosive growth of unstructured data with new Isilon and ClarityNow solutions

The new solution will be available in two configurations—ObjectEngine//A and ObjectEngine//Cloud.

The ObjectEngine//A will offer 25 terabyte/hr (TB/hr) backup performance and 15 TB/hr restore performance. The company claims that it can reduce storage and bandwidth costs by up to 97%.

On the other hand, ObjectEngine//Cloud is a secure and enterprise-ready platform that will enable openness, integration and data portability. It will come with a single namespace for data across hybrid cloud. This configuration can scale to provide over 100 TB/hr performance.

Pure Storage ObjectEngine is expected to be available in the first half of 2019.


Kaspersky Lab moving core infrastructure from Russia to Switzerland; opening first Transparency Center

By the end of 2019, data from customers in Europe will be stored and processed in Zurich

As part of its Global Transparency Initiative, Kaspersky Lab is adapting its infrastructure to move a number of core processes from Russia to Switzerland. This includes customer data storage and processing for most regions, as well as software assembly, including threat detection updates. To ensure full transparency and integrity, Kaspersky Lab is arranging for this activity to be supervised by an independent third party, also based in Switzerland.

Global transparency and collaboration for an ultra-connected world

The Global Transparency Initiative, announced in October 2017, reflects Kaspersky Lab’s ongoing commitment to assuring the integrity and trustworthiness of its products. The new measures are the next steps in the development of the initiative, but they also reflect the company’s commitment to working with others to address the growing challenges of industry fragmentation and a breakdown of trust. Trust is essential in cybersecurity, and Kaspersky Lab understands that trust is not a given; it must be repeatedly earned through transparency and accountability.

The new measures comprise the move of data storage and processing for a number of regions, the relocation of software assembly and the opening of the first Transparency Center.

Relocation of customer data storage and processing

By the end of 2019, Kaspersky Lab will have established a data center in Zurich; in this facility, it will store and process all information for users in Europe, North America, Singapore, Australia, Japan and South Korea, with more countries to follow. This information is shared voluntarily by users with the Kaspersky Security Network (KSN), an advanced, cloud-based system that automatically processes cyberthreat-related data.

Relocation of software assembly

Kaspersky Lab will relocate to Zurich its ‘software build conveyer’ — a set of programming tools used to assemble ready-to-use software out of source code. Before the end of 2018, Kaspersky Lab products and threat detection rule databases (AV databases) will start to be assembled and signed with a digital signature in Switzerland before being distributed to the endpoints of customers worldwide. The relocation will ensure that all newly assembled software can be verified by an independent organisation, showing that the software builds and updates received by customers match the source code provided for audit.

Establishment of the first Transparency Center

The source code of Kaspersky Lab products and software updates will be available for review by responsible stakeholders in a dedicated Transparency Center that will also be hosted in Switzerland and is expected to open this year. This approach will further show that generation after generation of Kaspersky Lab products have been built and used for one purpose only: protecting the company’s customers from cyberthreats.

Independent supervision and review

Kaspersky Lab is arranging for the data storage and processing, software assembly, and source code to be independently supervised by a third party qualified to conduct technical software reviews. Since transparency and trust are becoming universal requirements across the cybersecurity industry, Kaspersky Lab supports the creation of a new, non-profit organisation to take on this responsibility, not just for the company, but for other partners and members who wish to join.

Kaspersky Lab’s commitment

As a leading global cybersecurity solutions provider, Kaspersky Lab has always been committed to the most trustworthy industry practices, including strong protection for transmitted data, strict internal policies for data access, ongoing security testing of its infrastructure, and more. With this new set of measures, Kaspersky Lab aims to significantly improve the resilience of its IT infrastructure to any trust risk – even theoretical ones – and to increase its transparency to current and future clients as well as to the general public.

Commenting on the process move and Transparency Center opening, Eugene Kaspersky, CEO of Kaspersky Lab, said: “In a rapidly changing industry such as ours we have to adapt to the evolving needs of our clients, stakeholders and partners. Transparency is one such need, and that is why we’ve decided to redesign our infrastructure and move our data processing facilities to Switzerland. We believe such action will become a global trend for cybersecurity, and that a policy of trust will catch on across the industry as a key basic requirement.”

Learn more about Kaspersky Lab’s transparency principles and the Global Transparency Initiative on the company’s website.

Images Source: Kaspersky 


Azure Time Series Insights, a managed analytics service for IoT devices, now available in public preview

Microsoft recently announced the public preview of Azure Time Series Insights (TSI), a fully managed analytics, storage and visualization service used to manage IoT time-series data in cloud.

With the rapid rise in connected devices and advances in data collection, enterprises struggle to derive insights from the data generated by geographically dispersed connected solutions.

Businesses need to derive insights from the streams of events generated in near real time. If insights are delayed, the business can be impacted, even to the point of significant downtime.

The time series data solutions are critical for data scientists and process engineers in various industries to perform tasks like data analysis, storage, and KPI tracking.

Hence, it is necessary for organizations to optimize their operations and gain valuable insights in near real-time to gain a competitive edge.

Azure Time Series Insights makes it much easier for businesses to explore and analyze large numbers of event sources, such as IoT devices. It provides a near real-time view of terabytes of data and enables users to validate their IoT solutions so they can avoid downtime.

Using the new service, organizations can also detect hidden trends, spot anomalies and conduct root-cause analysis, without writing a single line of code. Users can integrate the capabilities of Time-Series Insights in their existing workflow or application.

TSI stores data in memory and on SSDs for up to 400 days, so users can access it easily. It provides out-of-the-box visualization through the Time Series Insights Explorer.

Furthermore, it will have integration with advanced analytics and machine learning tools including Spark and Jupyter notebooks. Microsoft said that it is fully integrated with cloud gateways like Azure IoT Hub and Azure Event Hubs.

Microsoft claimed that enterprises across industries such as wind farms, automotive, elevators, smart buildings, and manufacturing have been using the analytics service since its private preview.

Also read: Azure Databricks analytics platform now generally available within Microsoft public cloud

“Time Series Insights and other Azure IoT services have helped companies like BMW improve operational efficiency by reducing SLAs for validating connected device installation, in some cases realizing a reduction in time from several months to as little as thirty minutes,” said Joseph Sirosh, Corporate Vice President, Artificial Intelligence & Research, Microsoft.


Bit Refinery Chooses Digital Fortress’ Seattle Data Center to Host its Hadoop Hosting solution

Cloud infrastructure as a service provider Bit Refinery has chosen Digital Fortress’ Seattle data center to host its newly launched Hadoop Hosting solution.

Bit Refinery’s Hadoop Hosting solution allows companies to store massive amounts of data at very low cost. Organizations can also analyze the data without having to purchase expensive, massively parallel computing (MPC) appliances.


Other features of the new solution are fully dedicated servers, private high-speed network and full console control.

“Hadoop Hosting will provide Bit Refinery’s customers with an affordable way to get up and running with this new technology,” said Paul Gerrard, CEO, Digital Fortress.

“With Digital Fortress as its colocation partner, customers are assured the highest standard of uptime and full redundancy backed by a dedicated technical staff on-site 24x7x365, among other value-add services,” he added.

“We selected Digital Fortress as our colocation partner for our newly launched service because the company was able to support our need for mission-critical infrastructure,” said Brandon Hieb, Managing partner, Bit Refinery.

“In this environment we can support our customers’ need to store and analyze vast amounts of data while ensuring their data is replicated and secure,” he added.

Earlier this year, Bit Refinery launched its Disaster-Recovery-as-a-Service offering and a new cloud solution – vDev™.


BYTE into BIG DATA Summit to Take Place in Mumbai, India on November 21- 22, 2013

Business intelligence firm BE Summits today announced that “BYTE into BIG DATA Summit,” an event focused on new technology and upcoming innovations in the Big Data arena, will take place on the 21st and 22nd of November, 2013 in Mumbai, India.

The BYTE into BIG Data Summit will bring together key decision-makers from across industries – involved in data analysis, data storage and data management – to discuss, understand and elucidate how enterprises can effectively understand, analyze, protect and utilize the massive surge in data coming their way.

Some of the key topics that’ll be addressed during the two day summit are:

  • Regulatory & Compliance for BIG DATA.
  • BIG DATA Trends & Challenges.
  • Monitoring BIG DATA (Big Data Analytics).
  • Hadoop Framework – Importance of Hadoop in BIG DATA.
  • Security in BIG DATA.
  • Big Data Cloud Computing.

Some of the speakers who’ve confirmed their presence at the summit include:

  • Lakshmi Narayan Rao (Lux Rao) Chief Technologist – Cloud, Big Data & Mobility – Technology Services, Hewlett Packard.
  • Ahmed Aamer- Executive Director, SKY Computing.
  • Vijay Sethi – Vice President & Chief Information Officer, Hero MotoCorp Ltd.
  • N Jayantha Prabhu – Chief Technology Officer, Essar Group.
  • Kersi Tavadia – Chief Information Officer, Bombay Stock Exchange.

BE Summits has announced that attendance at the summit will be limited to 100 key decision-makers.

For more information on the event, please visit the BYTE into BIG Data Summit website or drop the organizers an email.


Bit Refinery Now Offers Hadoop Hosting; Allows Massive Data Storage at Low Costs

Cloud infrastructure as a service provider Bit Refinery today announced that it now offers a new dedicated service – Hadoop® Hosting™.

The news comes a few weeks after Bit Refinery launched a new cloud solution – vDev™.

The new service allows companies to store massive amounts of data through a very low cost solution. Organizations can also analyze the data without having to purchase expensive, massively parallel computing (MPC) appliances.

Other features of the new solution are fully dedicated servers, private high-speed network and full console control.


Bit Refinery’s Hadoop Hosting uses the Dell C series line of servers. Each server contains 4 “nodes” with the following configuration:

  • Dell C Series Dedicated Server.
  • Dual, Quad Core CPUs.
  • 24GB RAM.
  • 3, 1TB SATA Hard Drives (JBOD).
  • Redundant 1GB Ethernet Connectivity.
  • Full console control via KVM over IP with Media Connectivity.

Each node is priced at $300/month. The more nodes, the faster the results and the safer customers’ data, because each file is broken up into “blocks” and replicated among different nodes.
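The replication described above trades raw disk for redundancy: with Hadoop's default replication factor of 3, each block is stored three times, so usable capacity is roughly raw capacity divided by three. A back-of-the-envelope sketch using the node specs listed earlier (the helper is illustrative, not Bit Refinery's sizing tool):

```python
# Usable HDFS capacity after replication, ignoring filesystem overhead.
# HDFS replicates each block to multiple nodes (default replication factor: 3).
def usable_tb(nodes, disks_per_node, disk_tb, replication=3):
    raw = nodes * disks_per_node * disk_tb
    return raw / replication

# One Dell C-series server: 4 nodes x 3 x 1 TB disks = 12 TB raw, ~4 TB usable.
print(usable_tb(nodes=4, disks_per_node=3, disk_tb=1))  # 4.0
```

Adding nodes therefore raises both throughput (more disks working in parallel) and durability (more copies of each block), which is the point the pricing note is making.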

Bit Refinery has released a graphic illustrating a typical implementation using traditional physical servers for Hadoop nodes while relying on the redundancy of virtual servers powered by VMware.

Bit Refinery’s new service is ideal for the following types of organizations:

  • Companies and government agencies wanting to try out Hadoop technology without having to purchase their own equipment.
  • Enterprise software sales organizations for use with customer proof of concepts and internal development/R&D.
  • Small-to-enterprise-sized businesses with large amounts of data that want to launch their own data analytics projects.

Some of the benefits of Hadoop Hosting are:

  • Businesses, researchers and data analysts can process and analyze vast amounts of data.
  • The hosted service reduces IT CAPEX with no hardware to purchase.
  • Organizations may install any variation of Hadoop.

“Hadoop Hosting allows customers to forgo the large, up-front capital needed to get started with this new technology,” said Brandon Hieb, managing partner, Bit Refinery.

“Customers can store and analyze huge amounts of data with in-depth querying capabilities at an extremely low cost. In addition, because of the architecture of Hadoop, our customers’ data is replicated to other “nodes” so their data is safe. This service provides a great way for companies to ‘jump into’ Hadoop,” he added.

Here is more information on Hadoop Hosting from Bit Refinery.

Earlier this year, Bit Refinery launched its Disaster-Recovery-as-a-Service offering.


Windstream Hosted solutions to Open 72,000-Square-Foot Enterprise Class Data Center in Charlotte, NC

Managed hosting provider Windstream Hosted Solutions today announced that construction is underway on its new 72,000-square-foot enterprise-class data center in Charlotte, North Carolina, set to open later this year. Windstream’s fourth data center in Charlotte and seventh in North Carolina, the new facility will offer multiple 10,000-square-foot data center suites delivered in a phased approach.

Aimed at helping businesses manage their resources cost-effectively, the Charlotte data center will provide Windstream’s full suite of cloud computing, dedicated hardware, data storage, and managed services.

Some key features of the new facility are:

  • 2N modular design for capacity expansion.
  • True A-side and B-side power distribution via fully redundant 2N critical infrastructure.
  • Fully staffed on-site Network Operations Centers (NOCs) providing 24x7x365 facilities and network monitoring, security, technical and remote-hands support.
  • Access to multi-tenant infrastructure, such as Enterprise Cloud, EMC and NetApp SAN/NAS, F5 GTM/LTM platforms, as well as Cisco and Juniper network and security platforms.
  • High-efficiency chillers with 450 tons of 2N day-1 cooling capacity, expandable to 1,350 tons.
  • Fire suppression, including both FM200 and dry-pipe (double-interlock pre-action) systems, plus a VESDA early detection system.
  • High-capacity, burstable, carrier-neutral Internet connectivity via multiple Tier I Internet providers.
  • 3,000 kVA of day-1 utility capacity, easily expanded to over 12 MVA.
  • 1,215 kW of day-1 usable UPS capacity, expandable to over 3,600 kW.
  • Customer workspace and build rooms.

“Many of our customers have already realized the benefits of integrated, personalized solutions from Windstream Hosted Solutions, and demand for those services has continued to increase significantly year over year,” said Chris Nicolini, Senior Vice President, Data center operations, Windstream Hosted Solutions.

“This newest data center represents our commitment to providing customers with the highest level of services and reliability in order to meet the changing needs of their businesses,” he added.

Windstream Hosted Solutions offers managed and dedicated hosting solutions, including public, private, and hybrid cloud products. It also offers voice and data services such as VoIP, SIP trunking and MPLS.

For more information, click here.


PMC Improves Data Centers Through Its Second-Generation SSD Caching Solution

DAILYHOSTNEWS, October 18, 2011 – PMC-Sierra, Inc., the semiconductor innovator transforming storage, optical and mobile networks, today announced the Adaptec Series 6Q with maxCache 2.0 SSD caching, designed to accelerate data center and cloud computing application performance. maxCache 2.0 employs an intelligent, learned-path algorithm to radically improve the performance of HDD-based arrays by using solid state drives (SSDs) to cache frequently accessed data.

The latency reductions delivered by maxCache 2.0 provide a direct end-user benefit for data centers. A report by Shopzilla found that a five-second speedup resulted in a 10 percent increase in revenue and a 50 percent decrease in hardware. Similarly, Amazon has cited that every 100 milliseconds of latency can result in a one percent drop in sales.

“Data centers have multi-faceted needs that PMC uniquely addresses through our innovative solutions and ability to work with customers to optimize our products in their environment for their application,” said Jared Peters, vice president and general manager of PMC’s Channel Storage Division. “With the introduction of the Series 6Q, PMC has reinforced its commitment to deliver intelligent solutions that address the distinctive pain points of data centers.”

The Series 6Q with maxCache 2.0 also improves IOPS and allows data centers to perform more transactions with a single server and reduce the overall number of servers required. This leads to a reduction in capital and operational expenses, including power, cooling and maintenance costs.
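The server-consolidation argument above is straightforward to sketch: if caching raises the IOPS each server can sustain, fewer servers cover the same workload. The numbers below are hypothetical for illustration, not PMC benchmarks:

```python
import math

# Servers required to sustain a target IOPS workload,
# given the IOPS each server can deliver.
def servers_needed(total_iops, iops_per_server):
    return math.ceil(total_iops / iops_per_server)

# Hypothetical example: SSD caching lifts per-server IOPS from 20k to 50k,
# shrinking a 200k-IOPS deployment from 10 servers to 4.
print(servers_needed(200_000, 20_000))  # 10
print(servers_needed(200_000, 50_000))  # 4
```

Every server removed also removes its share of power, cooling and maintenance cost, which is how an IOPS improvement becomes an OPEX claim.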

Also read: Colocation Data Center Singapore

The Series 6Q eliminates the maintenance and environmental costs of lithium-ion battery-based solutions. This elimination of lithium-ion batteries, together with the power and cooling reductions delivered by maxCache 2.0, makes the Series 6Q the industry’s “greenest” SSD caching solution.

“As a member of The Green Grid, PMC is embracing our mission and the goal of standardizing more efficient use of resources in data centers,” said Mark Monroe, executive director, The Green Grid. “The Green Grid’s mission is to improve resource efficiency in business computing, which incorporates technology that can reduce energy consumption and lower environmental impact.”

About Adaptec by PMC
Adaptec by PMC storage products protect, accelerate, optimize and condition data.

About PMC
PMC is the semiconductor innovator transforming networks that connect, move and store digital content.


Atlantis Computing Recognized For Its Outstanding Services

DAILYHOSTNEWS, October 13, 2011  – Atlantis Computing, the leader in Virtual Desktop Infrastructure (VDI) storage and performance optimization solutions, today announced it has been recognized on The Leaderboard as one of the Top 100 Up and Coming Innovators, Movers and Shakers. The company was named to the Top 100 list based on its innovative technology for virtual desktops. The list serves as an acknowledgement to vendors gaining mind share in Q3 of 2011.

VDI is the fastest growing approach to allowing IT departments to centrally host and manage user desktops on a virtual machine residing in the data center. Users can access their server-hosted virtual desktops, from anywhere there is connectivity and from a broad range of end-user devices using one of several remote display protocols.

Through the Atlantis solution, end users can enjoy a virtual desktop experience that is better and faster than a physical PC, while IT cuts VDI CAPEX by up to 75 percent and gains the option of lower-cost storage, including SAN, NAS or local disks. It also enables IT organizations to use anti-virus agents while overcoming the associated performance and security hurdles.

Atlantis ILIO virtual desktop optimization solution complements VMware, Citrix and Quest virtual desktop deployments to overcome the storage, performance and scalability challenges mostly associated with VDI deployments. The Atlantis ILIO software allows organizations that are deploying virtual desktops to ensure the success of their project by slashing the amount of storage normally required, boosting performance to ensure user acceptance and enabling security to be deployed without impacting server density.

Also read: Colocation Data Center Singapore

Bernard Harguindeguy, CEO, Atlantis Computing said: “Companies like Google, IBM and Amazon are transforming the cloud computing space and to be included with this caliber of cloud providers is a real honor. With Atlantis ILIO our customers can finally afford to deploy virtual desktops and at the same time win user support for their project.”

About Atlantis Computing
Atlantis Computing is privately held and funded by El Dorado Ventures, Partech International and Cisco Systems with headquarters in Mountain View, California, and offices in London, England and Bangalore, India. The Atlantis ILIO software complements Citrix, VMware, and Quest VDI solutions to cut VDI costs and deliver a desktop that is faster than a PC.
