Categories
Articles Cloud

Top 5 priorities to master competencies in selecting, buying and deploying cloud services

One of the most complicated processes for enterprises is selecting, buying, and deploying public cloud services and tools while avoiding the pitfalls.

Since there are numerous cloud providers out there, and no two of them are the same, infrastructure and operations (I&O) leaders find it difficult to select the right provider.

“Choosing and managing cloud offerings is a critical skill for I&O leaders to master, given cloud computing’s central role in next-generation initiatives such as digital business, the Internet of Things (IoT) and artificial intelligence,” says Elias Khnaser, Senior Research Director at Gartner.

“Years from now, you don’t want to look back with regret, as the choices you make can have a lasting impact.”

To avoid looking back with regret, Gartner has identified five priorities that will help I&O leaders to select, buy, and deploy cloud offerings.

1. Analyze technical and architectural details of cloud providers

Technical architecture is critical for every organization, as it needs to integrate with existing workflows not only now but also in the years to come. The technical architectures of most cloud platforms, however, are large, complex and difficult to understand.

Gartner said it is important to determine the key components of each architecture, how they work together, and how they affect the overall solution. Technical categories to prioritize include self-service, elasticity, network access, security, regulatory compliance, and operational capabilities.

2. Understand the way cloud services measure up against requirements

I&O leaders should consider how the cloud services stack up against the key requirements and criteria of their organization. For instance, before choosing a standard cloud offering, the main requirements can be simplicity, performance, feature set, and costs.

The key requirements differ slightly between infrastructure as a service (IaaS) and application platform as a service (aPaaS). For IaaS, the key consideration areas are compute, network, storage, security and support, whereas for aPaaS they are application architecture components, developer tools, virtualization and hosting architecture, code deployment, life cycle management, scalability, and availability.
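To make such a comparison concrete, here is a minimal sketch of a weighted scoring exercise an evaluation team might run. The criteria, weights, provider names and scores below are illustrative assumptions, not data from Gartner or any vendor.

# Hypothetical weighted-scoring sketch for comparing IaaS offerings.
# Criteria, weights, provider names and scores are invented for illustration.

WEIGHTS = {"compute": 0.30, "network": 0.20, "storage": 0.20, "security": 0.20, "support": 0.10}

offerings = {
    "Provider A": {"compute": 4, "network": 3, "storage": 5, "security": 4, "support": 3},
    "Provider B": {"compute": 5, "network": 4, "storage": 3, "security": 3, "support": 4},
}

def weighted_score(scores, weights):
    """Return the weighted sum of criterion scores (1-5 scale)."""
    return sum(weights[criterion] * scores[criterion] for criterion in weights)

for name, scores in offerings.items():
    print(f"{name}: {weighted_score(scores, WEIGHTS):.2f}")

The same structure can be reused for aPaaS by swapping in a weight table built from the aPaaS criteria listed above.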

3. Learn about cloud provider’s approach to security and compliance tools

As enterprises increase their adoption of cloud services, the requirement to meet the regulatory and data privacy rules that govern the processing of data has also increased.

For example, the EU’s General Data Protection Regulation (GDPR) applies to all enterprises that process and hold the personal data of individuals in the EU.

As per Gartner, I&O leaders should understand each cloud provider’s approach to data privacy and compliance regulations.

4. Set criteria for evaluating cloud management solutions

With the increasing adoption of cloud services, providers keep introducing new cloud-native offerings. When an organization uses several such services and tools, cloud management platforms and tools become important.

Hence, organizations should create criteria to evaluate cloud management solutions, and a strategy to guide their selection and implementation processes.

5. Prepare for cloud service governance

Enterprises generally give more weight to time-to-functionality decisions than to planning for long-term stability and support. However, I&O leaders should take time to prepare for cloud service governance by understanding the process and architecture options.

Also read: Public cloud services revenue in India will reach $2.5 billion in 2018: Gartner

“An effective cloud account governance and design strategy provides I&O leaders with the ability to effectively scale, avoid sprawl, and reduce networking and management complexities. This helps avoid the need for disruptive retrofitting of the infrastructure months or years after it has transformed into a critical production platform,” says Khnaser.

Categories
Articles Cloud Cloud News News

Organizations have 14 misconfigured public cloud services running at any given time: McAfee study

On average, an organization experiences over 2,200 misconfiguration incidents every month in its public cloud instances, according to a report by McAfee. These cloud instances include infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS).

For the report, titled Cloud Adoption and Risk Report, McAfee analyzed billions of events in anonymized cloud production use to find the current state of cloud deployments and expose risks.

“Operating in the cloud has become the new normal for organizations, so much so that our employees do not think twice about storing and sharing sensitive data in the cloud,” said Rajiv Gupta, senior vice president of the Cloud Security Business, McAfee.

“Accidental sharing, collaboration errors in SaaS cloud services, configuration errors in IaaS/PaaS cloud services, and threats are all increasing. In order to continue to accelerate their business, organizations need a cloud-native and frictionless way to consistently protect their data and defend from threats across the spectrum of SaaS, IaaS and PaaS.”

Key findings of McAfee’s Cloud Adoption and Risk Report:

  • 21% of data in cloud is sensitive

According to the report, organizations consider 21% of their data in the cloud to be sensitive, and the amount of sensitive data stored in the cloud has increased by 53% year over year. This data is at risk of being stolen or leaked if a cloud misconfiguration incident occurs.

Today, more and more organizations are using public cloud to provide new digital experiences to their customers. But organizations that haven’t adopted a cloud strategy are at risk of losing their most valuable asset. A sound cloud strategy can include data loss prevention, configuration audits, and collaboration controls.

Further, organizations without a cloud strategy also expose themselves to the risk of noncompliance with internal and external regulations.

  • 20% of sensitive data in cloud runs through email services

No doubt, cloud services help organizations accelerate their business by making them more agile with resources, offering the ability to scale, and creating opportunities for collaboration.

Cloud services like Office 365 increase the effectiveness of collaboration, which inherently involves sharing. However, uncontrolled sharing can result in data exposure. The report found that 22% of cloud users share files externally, an increase of 21% YoY.

Sharing of sensitive data via an open, publicly accessible link has increased by 23% YoY, while sensitive data sent to personal email addresses has increased by 12% YoY.

Top collaboration and file sharing services

For the last five years, Office 365 applications have dominated the list of top 10 collaboration services, followed by G Suite services.

  • Enterprises using IaaS and PaaS had 14 misconfigured services running at any given time

Currently, 65% of organizations globally are using some form of IaaS, while 52% are using PaaS.

Since it is costly to buy and maintain servers, organizations go for IaaS and PaaS, which give IT teams the ability to spin up virtual machines, containers or functions as a service as the need arises.
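As an illustration of how such on-demand provisioning looks in practice, below is a minimal sketch that launches a single virtual machine with the AWS boto3 SDK. The region, AMI ID and instance type are placeholder assumptions, and credentials are assumed to be configured already.

# Minimal sketch: launching one VM on demand with the AWS boto3 SDK.
# Region, AMI ID and instance type are placeholder assumptions.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)

print("Launched instance:", response["Instances"][0]["InstanceId"])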

For IaaS and PaaS, organizations trust Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) the most. AWS clearly leads the pack with a 94% share of all IaaS usage, while Azure and GCP account for 3.7% and 1.3% respectively.

Additionally, 78% of organizations are using a multi-cloud strategy, leveraging both AWS and Azure together.

McAfee study found that on average, enterprises using IaaS and PaaS had 14 misconfigured services running at any given time, resulting in an average of 2,269 misconfiguration incidents per month.

  • 80% of organizations experience at least one compromised account threat per month

As per the report, most threats to data in the cloud result from compromised accounts and insider threats. On average, an organization generates over 3.2 billion threat events per month in the cloud, including compromised accounts, privileged user activity and insider threats. Such events have increased by 27.7% YoY.

80% of all organizations report that they experience at least one compromised account threat per month, while 92% of organizations have stolen cloud credentials up for sale on the Dark Web.

Also read: 25% of businesses had their data stolen from public cloud: McAfee Study

To secure sensitive data in cloud storage, file-sharing and collaboration applications, enterprises first need to understand which cloud services they are using. They must then identify which services hold sensitive data, and how and with whom that data is being shared.

Once they know this, they can enforce suitable security policies to prevent highly sensitive data from being stored in unapproved cloud services. They also need to continuously audit and monitor their IaaS and PaaS configurations.
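As one narrow example of what such a configuration audit can look like, the sketch below uses the AWS boto3 SDK to flag S3 buckets whose ACLs grant public access. It covers only one class of misconfiguration and assumes suitable credentials are already configured.

# Illustrative configuration audit: flag S3 buckets whose ACL grants public access.
# Covers only one class of misconfiguration; credentials are assumed to be set up.
import boto3

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    acl = s3.get_bucket_acl(Bucket=bucket["Name"])
    public_grants = [g for g in acl["Grants"] if g["Grantee"].get("URI") in PUBLIC_GRANTEES]
    if public_grants:
        print(f"MISCONFIGURED: bucket '{bucket['Name']}' has a public ACL grant")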

Download the full Cloud Adoption & Risk Report here.

Categories
Articles Business

Doing cloud is not important, doing it right is important: 6 key takeaways from the State of DevOps 2018 Report

The DevOps industry continues to evolve and improve, as adopting DevOps is no longer a choice for organizations; rather, it is a competency required to perform better. This and many other important assessments have been made by DORA (DevOps Research & Assessment) in its report, Accelerate: State of DevOps 2018: Strategies for a New Economy.

The 2018 State of DevOps report links software delivery and operational (SDO) performance with organizational performance. It observes that strong software development (Dev) and software operations (Ops), along with cloud, are the key drivers of successful software delivery, which in turn is an important component in driving up performance in every kind of organization planning a digital transformation.

The report contains many new findings that will help businesses judge their software delivery strategies, improve the quality of their IT teams and resource management, and gain better productivity.

What differentiates elite teams from low performing ones?

Deploying and delivering software in complex systems is difficult. Teams that can develop and deliver software quickly are considered elite, as they are better able to increase customer adoption and satisfaction while keeping up with regulatory and compliance demands.

Thus, software delivery speed and operational performance act as the key differentiators for the teams because they enable organizations to utilize software to deliver improved outcomes.

Elite teams have more frequent code deployments, faster lead time from commit to deploy for the primary application, lower change failure rate and are fast to recover from incidents.
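For readers who want to track these four indicators on their own teams, here is a rough sketch of how they could be computed from deployment records. The record format and all numbers are invented for illustration and are not taken from the DORA report.

# Rough sketch: computing the four delivery metrics from hypothetical deployment records.
# The record format and all numbers are invented for illustration.

deployments = [
    # (lead_time_hours from commit to deploy, deployment_failed, hours_to_restore)
    (2.0, False, 0.0),
    (1.5, True, 0.5),
    (3.0, False, 0.0),
    (2.5, False, 0.0),
]
days_observed = 7

deployment_frequency = len(deployments) / days_observed              # deploys per day
avg_lead_time = sum(d[0] for d in deployments) / len(deployments)    # hours, commit to deploy
failures = [d for d in deployments if d[1]]
change_failure_rate = len(failures) / len(deployments)
mean_time_to_restore = (sum(d[2] for d in failures) / len(failures)) if failures else 0.0

print(deployment_frequency, avg_lead_time, change_failure_rate, mean_time_to_restore)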

Elite teams are also more likely to have strong availability practices:

  • They keep promises and assertions about the software product or service they operate.
  • They ensure end users can easily access the product or service.

Elite teams also adopt best practices while transforming their businesses digitally: they adopt DevOps practices, implement cloud infrastructure, use open-source software and adopt technical practices that drive high performance.

Let’s delve into these practices in detail.

6 key takeaways: best practices for an organization’s digital transformation journey

1. Implement DevOps practices and capabilities to drive organizational performance and quality outcomes 

Organizations and teams that embark on a technology transformation journey have two important goals: organizational performance and quality outcomes. DORA’s research finds that software delivery performance and availability unlock competitive advantages such as improved productivity, increased profitability, market share and customer satisfaction, while strengthening the abilities that help organizations achieve their goals.

The report states that those who implement DevOps practices do more value-added work. With more automation, companies using DevOps do less manual work and spend less time fixing problems, reworking, remediating, and handling customer support. Above all, they free their technical staff to do proactive or new work, designing, building, and working on features, tests, and infrastructure in a structured and productive way that creates value for their organizations. As a result, they are able to develop and deliver faster and increase customer adoption and satisfaction.

2. Focus on the implementation of Cloud infrastructure, adopt essential patterns that matter

As per the DORA survey, 67% of respondents were using some kind of cloud platform to host their primary application or service.

Source: DORA DevOps Report 2018

But despite the huge amount of time and money organizations spend on implementing cloud technologies, many fail to leverage the capabilities provided by cloud computing platforms.

Thus, doing cloud is not important, doing it right is important.

  • Adopt essential characteristics of cloud computing

Report findings state that teams that adopt essential cloud characteristics are 23 times more likely to be elite performers.

On-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service are the five essential characteristics of cloud computing that impact software delivery performance.

Source: DORA DevOps Report 2018

  • Implement PaaS (Platform as a Service) and use infrastructure as code

Organizations that use PaaS are 1.5 times more likely to be in the elite performance group.

The DORA report states that teams can use the libraries and infrastructure defined by PaaS to deploy applications into the on-demand cloud in a single step. They can also make self-service, on-demand changes to databases and other services as their applications require.

Also, the infrastructure-as-code paradigm helps reproduce and change the state of environments automatically from information held in version control, eliminating manual infrastructure configuration, as sketched below.
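A minimal sketch of that idea, assuming an AWS environment: an infrastructure template kept under version control is applied programmatically instead of being configured by hand. The stack name and template path are hypothetical, and boto3 credentials are assumed to be configured.

# Infrastructure-as-code sketch: apply a version-controlled template programmatically.
# Stack name and template path are hypothetical; the template lives in version control.
import boto3

with open("infra/network.yaml") as template_file:   # tracked in version control
    template_body = template_file.read()

cloudformation = boto3.client("cloudformation", region_name="us-east-1")
cloudformation.create_stack(
    StackName="demo-network",        # hypothetical stack name
    TemplateBody=template_body,
)
print("Stack creation started")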

  • Adopt cloud native design practices and use containers

Cloud-native applications are designed around the constraints inherent to cloud systems, so they must be resilient, elastic, and easy to deploy and manage on demand.

Source: DORA DevOps Report 2018

So, implement cloud-native architectures: those who do are 1.8 times more likely to be elite performers.

3. Avoid outsourcing as it hurts performance

There is no doubt that outsourcing by function provides flexibility and saves money, yet it is rarely adopted by elite performers.

As per DevOps report, low-performing teams are 3.9 times more likely to use functional outsourcing (overall) than their elite performance counterparts, and 3.2 times more likely to use outsourcing for application development, IT operations work, or testing and QA.

 4. Adopt best technical practices

Key practices that elite and high performers use for successful technology transformations and for achieving SDO performance include monitoring, observability, continuous testing, database change management, and integrating security.

Other technical practices organizations should adopt include use of version control, deployment automation, continuous integration, trunk-based development, and a loosely coupled architecture.

Source: DORA DevOps Report 2018

5. Focus on and adopt lean and agile practices

Lean product management positively affects availability. Lean product management approaches help businesses get fast customer validation, which in turn enables them to respond quickly and integrate feedback in a timely manner. The report illustrates lean product management practices as follows:

Source: DORA DevOps Report 2018

6. Influence organizational culture through autonomy, leadership and learning

Organizations should not ignore the importance of their people and culture during digital transformation. They should clearly communicate outcomes and goals to their teams and give them autonomy in their work, as this builds trust and understanding.

Also, create a climate for learning. Elite performers are 1.5 times more likely to consistently hold retrospectives and use them to improve their work. The DORA report states that teams that leverage findings from retrospectives to change tooling, processes, or procedures see the strongest impacts.

So, build your business on autonomy, trust, learning, and information flow; this will help accelerate and improve profitability, productivity, and customer satisfaction.

Thus, CIOs should help their companies invest in smart software delivery processes and strategies to boost their digital transformation journey.

Source: https://cloudplatformonline.com/2018-state-of-devops.html

Categories
Cloud Cloud News

Enterprise SaaS market generates $20 billion quarterly, with Microsoft and Salesforce dominant: Synergy Research

According to second-quarter data from Synergy Research, the enterprise SaaS (software-as-a-service) market generates $20 billion in quarterly revenue for software vendors, a figure that is growing by 32% each year.

Synergy stated that the SaaS market has matured in many ways, but there are still a number of factors that will limit its growth in the years to come. Currently, the SaaS market accounts for less than 15% of total enterprise software spending.

The SaaS market is not growing as fast as the IaaS (infrastructure-as-a-service) and PaaS (platform-as-a-service) markets, but it is substantially bigger and will remain so, with rapid growth across all segments and geographic regions.

Microsoft currently dominates the global SaaS market with a 17% share. It overtook Salesforce because of its leadership in the high-growth collaboration segment.

Salesforce continues to lead the CRM segment, but its growth across SaaS segments is relatively lower. Adobe, Oracle, and SAP follow Microsoft and Salesforce in the SaaS market, with Oracle achieving the highest growth rate among the three.

The top five vendors in the enterprise SaaS market account for more than 50% of it, and the next ten vendors account for another 26%, among which ServiceNow, Google, ADP and Workday saw the highest growth rates.

“There is a fascinating battle for SaaS playing out, with traditional enterprise software vendors slugging it out with born-in-the-cloud vendors like Workday, Zendesk, ServiceNow and Dropbox,” said John Dinsdale, a Chief Analyst at Synergy Research Group.

“The latter group are helping to rapidly transform the market, but the more traditional players like Microsoft, SAP, Oracle and IBM still have a huge base of on-premise software customers that they can convert to a SaaS-based consumption model. Meanwhile Cisco and Google too are making ever-bigger inroads into the SaaS market, via Cisco’s collaboration apps and software vendor acquisitions and Google’s G Suite.”

Also read: 6 million new domain names registered in second quarter 2018, as total count reaches 339.8 million globally: Verisign

Synergy noted that the SaaS market remains quite fragmented, with different vendors leading each of the main market segments.

Image source: Synergy Research

Categories
Articles

Cloud Hosting Comes Out as a Winner! Here’s Why…

When cloud technology first arrived, nobody was sure how significant its impact on businesses would be. Eventually it started creating a buzz in the market. Cloud technology has transformed the way businesses operate, in addition to saving time and cost, which has made it one of the most commonly used technologies of the last decade. From SMEs to mid-size companies to fully established businesses, everyone is shifting to the cloud, because this technology increases scalability and delivers higher performance.

Cloud technology has extended its roots into the hosting environment as well. In simple terms, cloud hosting follows the principle of ‘divide and rule’: your website needs resources to run, and these resources are distributed across multiple machines throughout the network and made available as needed.

Unplanned Traffic Spikes

A sudden increase in website traffic may be a good sign for your business, as it ultimately helps generate more leads and more sales. But have you ever considered that a sudden traffic spike could do more damage than good? Think of a glass being filled beyond its capacity: it cannot hold the extra water, which simply overflows.

Similarly, a sudden surge in website traffic can exceed the maximum server resources allocated to the website. This increases the load on the server, and the site may fail to respond to visitors’ requests. Even if your site is optimized for excellent speed, the server may buckle if it gets overloaded; the site slows down and becomes unresponsive, disappointing customers and affecting business revenue.

If you wish to take advantage of this high incoming traffic and generate profit from it, it’s time to migrate your website to the cloud. Cloud hosting greatly reduces the chance of downtime in the event of a server breakdown, because it instantly allocates resources on demand.
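A toy sketch of that on-demand allocation: given an assumed per-server capacity and target utilization (both invented numbers), a platform can work out how many servers a traffic spike requires and scale accordingly.

# Toy sketch of on-demand capacity allocation during a traffic spike.
# Per-server capacity, target utilization and traffic figures are invented.
import math

def servers_needed(requests_per_sec, capacity_per_server=200, target_utilization=0.7):
    """Return how many servers keep utilization at or below the target."""
    return math.ceil(requests_per_sec / (capacity_per_server * target_utilization))

normal_traffic, spike_traffic = 300, 2400   # requests per second
print(servers_needed(normal_traffic))        # 3 servers under normal load
print(servers_needed(spike_traffic))         # 18 servers during the spike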

Downtimes are Intolerable

If the server goes down for any reason, your website becomes inaccessible for that span of time. Downtime hurts your website’s SEO, sales and reputation, and nobody wants that! Every business works hard toward zero website downtime, but you cannot control the inevitable: several things can make a server unavailable, such as overloading or unavailability of resources.

The solution to this problem lies in cloud hosting, as resources are distributed across various servers on the cloud network. If one server fails, another server is already in place to take over the website.

Cost Management

Your website requirements may expand at any time, prompting you to opt for a server plan with more resources. But what if, most of the time, you don’t need the resources you are paying for? In such cases you are overpaying, because you pay a fixed amount irrespective of your actual usage. This happens mostly with traditional hosting. It is like booking a hotel room: from check-in to check-out you are charged for the entire stay, regardless of how many facilities you used or how long you actually spent in the room.

In a cloud hosting environment, you don’t pay a fixed cost. Cloud hosting works on a pay-per-use model: you pay only for the resources you utilize. You can compare this model to your electricity usage; both provide resources on demand and charge you only for what you have actually used.
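To put rough numbers on the comparison, here is a small worked example; the hourly rate, reserved capacity and usage hours are all invented for illustration.

# Worked example of fixed vs pay-per-use billing; all prices and hours are invented.

hourly_rate = 0.10                    # assumed price per server-hour
hours_in_month = 730
servers_reserved = 4                  # fixed plan: pay for peak capacity all month
usage_hours = [730, 730, 150, 40]     # hours each server was actually needed

fixed_cost = servers_reserved * hours_in_month * hourly_rate
pay_per_use_cost = hourly_rate * sum(usage_hours)

print(f"Fixed plan:  ${fixed_cost:.2f}")        # $292.00
print(f"Pay per use: ${pay_per_use_cost:.2f}")  # $165.00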

Future of Cloud Technology

  • Cloud technology is the fastest-growing technology in the market today. According to Gartner, Inc., the public cloud services market is expected to grow from $153.5 billion in 2017 to $186.4 billion in 2018.
  • The SaaS market is expected to reach $73.6 billion in 2018.
  • 83% of enterprise workloads will operate on public cloud platforms by 2020.

Other Trends

Multi cloud strategy

  • According to the RightScale survey, multi-cloud is the preferred strategy among enterprises.
  • 81% of enterprises use a multi-cloud strategy.

Industry-specific cloud computing

  • Industry-specific clouds will become standard for fulfilling unique industry requirements.
  • The user base will become more diverse.

Hybrid Cloud

  • Enterprises will prefer hybrid cloud over public cloud servers, as predicted by the Nasscom Community.
  • This may result in cloud providers launching API platforms.

Summary

Cloud technology is growing even faster than expected and has come a long way over the last few years. More and more businesses are shifting to the cloud because it helps them meet business challenges, and many enterprises prefer cloud hosting because they are aware of its advantages.

About Guest Author – Disha Kajale

Disha is currently working as a digital marketing executive and social media associate at MilesWeb. Her responsibilities include creating high-quality content for blogs, articles, social media and webpages at MilesWeb. In her free time, she researches audience engagement and marketing strategies on various social media platforms.

Categories
Cloud Cloud News Datacenter News

Public cloud services revenue in India will reach $2.5 billion in 2018: Gartner

The revenue of public cloud services in India is expected to reach $2.5 billion this year, up 37.5% from a year before, according to Gartner. In 2017, the public cloud revenue in India was $1.8 billion.

“While the public cloud revenue market in India exhibits solid growth in 2018, the growth rate is expected to flatten, which is indicative of a maturing market,” said Sid Nag, research director at Gartner.

Public cloud is divided into five segments: Business Process as a Service (BPaaS), Platform as a Service (PaaS), Software as a Service (SaaS), Cloud Management and Security Services, and Infrastructure as a Service (IaaS).

The revenue growth of public cloud services will mainly be driven by the IaaS segment, which is expected to reach $1 billion in 2018, up 46% from 2017. Growth in the IaaS segment is being driven by enterprises moving away from data center build-outs and by consolidation among data center vendors.

“While IaaS enables efficiencies and cost benefits, organizations need to be cautious about IaaS providers potentially gaining unchecked influence over customers and the market,” said Mr. Nag. “In response to multicloud adoption trends, organizations in India are also increasingly demanding a simpler way to move workloads, applications and data, across cloud providers’ IaaS offerings without penalties.”

SaaS, the largest segment of the Indian public cloud market in 2017, is predicted to reach $932 million this year, up 34% from last year. Gartner said that organizations would continue to move applications and workloads to locally delivered cloud services, as opposed to running them on-premises. The demand for purpose-built services that deliver specific business outcomes is rapidly increasing.

The PaaS segment will reach $191 million in 2018, up from $143 million in 2017. Within this segment, database PaaS (dbPaaS) in particular is forecast to total $32 million this year, an increase of 50% from a year before.

Gartner said that growth in the dbPaaS segment presents an opportunity for hyperscale cloud providers to include it in their services and attract more customers.

Public cloud services revenue in India (Image source: Gartner)

Also read: Spending on data center infrastructure in India will reach $2.7 billion in 2018: Gartner

Gartner will share additional analysis on data center and IT operations trends at Gartner global IT Infrastructure & Operations events.

Categories
Cloud Cloud News News

Worldwide public cloud market to hit $186.4 billion, with hyperscale cloud providers dominating it: Gartner

According to a new analysis by Gartner, the public cloud services market is projected to grow from $153.5 billion in 2017 to $186.4 billion in 2018, a rise of 21.4 percent.

Among the cloud segments, IaaS (infrastructure-as-a-service) was identified as the fastest growing, predicted to grow 35.9 percent in 2018 to reach $40.8 billion, led by providers such as Amazon Web Services (AWS) and Microsoft Azure.

Source: Gartner

SaaS (software-as-a-service) was again identified as one of the largest segments of the cloud market, with revenue expected to grow 22.2 percent to hit $73.6 billion in 2018. Gartner also predicted that by 2021, SaaS will reach 45 percent of total application software spending.

SaaS-based application models are becoming the preferred choice for most enterprises. Sid Nag, a research director at Gartner, thinks SaaS demands are changing, with users seeking more purpose-built solutions that can meet their specific business outcomes.

Within the PaaS (platform-as-a-service) segment, dbPaaS (database platform as a service) is seeing the highest demand and is expected to hit $10 billion by 2021. As a result, hyperscale cloud providers are expanding their range of services to include dbPaaS.

Talking about the high demand for dbPaaS, Mr. Nag said that customers should explore dbPaaS offerings beyond those of the large service providers to avoid lock-in.

Despite the high forecast rates, Gartner expects the growth rate to stabilize from 2018 onwards, due to the maturity that cloud services might gain within the IT segment.

One of the primary challenges here is avoiding vendor lock-in. With big cloud providers like AWS and Microsoft offering the majority of cloud services, companies that commit to one vendor’s cloud platform can find it very expensive and complicated to move away later.

Gartner said that this scenario might give rise to new demands by customers who want easy migration of their apps and data, without any penalties.

Categories
Cloud Cloud News Datacenter

95% of total datacenter traffic to come from cloud by 2021: Cisco report 

Global cloud traffic will nearly triple over the next five years, accounting for 95% of total datacenter traffic by 2021, as per Cisco GCI report.

The Cisco Global Cloud Index (2016-2021), the company’s seventh annual report, focuses on data center virtualization and cloud computing. It reveals that both consumer and business applications are driving cloud adoption.

Datacenter traffic is growing rapidly: global cloud IP traffic will touch 19.5 ZB (zettabytes) per year by 2021, up from just 6 ZB a year in 2016. Big data, meanwhile, will reach 403 EB (exabytes) by 2021, up from 25 EB in 2016, and will alone represent 30% of overall datacenter traffic.

With the rapid rise in demand for datacenter and cloud resources, large-scale public cloud datacenters known as hyperscale datacenters have been developed. There were 338 hyperscale datacenters in 2016; the count will grow to 628 by 2021, by which point they will host 53% of all installed datacenter servers.

“Data center application growth is clearly exploding in this new multicloud world. This projected growth will require new innovations especially in the areas of public, private and hybrid clouds,” said Kip Compton, Vice President of Cisco’s Cloud Platform and Solutions Group.

According to the study, by 2021, 94% of workloads and compute instances will be processed by cloud datacenters, while only 6% will be processed by traditional datacenters.

Of the total cloud workloads and compute instances, SaaS will comprise 75%, followed by IaaS (16%) and PaaS (9%) in 2021.  

Improvements in data control and datacenter governance have reduced security concerns, which have been major barriers to cloud adoption. Additionally, advanced technologies like the internet of things (IoT) and artificial intelligence (AI) will further increase datacenter demand.

IoT applications like smart cars, smart cities, connected health and digital utilities need scalable computing and storage solutions. IoT connections are expected to reach 13.7 billion by 2021, up from 5.8 billion in 2016, and the data created by IoT devices will grow from 218 ZB per year in 2016 to 847 ZB per year in 2021.

Also read: Cisco unveils its own container platform for multicloud environments

The Cisco GCI 2016–2021 report concludes that, alongside the growth in datacenter traffic, datacenters are also being streamlined with architectural innovations like NFV and SDN. Cloud traffic will more than triple over the forecast period, with most of that traffic enabled by the rapid extension of datacenter virtualization.

Categories
Cloud Cloud News

Microsoft to use Cisco Solution Support for better network connectivity with Azure ExpressRoute

Microsoft has teamed up with Cisco to provide customers more secure network connectivity to the Microsoft Azure cloud platform with Azure ExpressRoute.

Azure ExpressRoute helps users establish a private, direct connection to Microsoft cloud services like Microsoft Azure, Office 365, and Dynamics 365. It also enables them to extend their on-premises networks into the Microsoft cloud, which helps in running and managing business-critical applications and services.

Enterprises that move to the cloud from a traditional IT model face a number of challenges, such as increased complexity, loss of speed and data integrity, limited connectivity and management hassles.

To help overcome these challenges, Cisco will now provide its Solution Support for Azure ExpressRoute, building a new network practice that provides fast, reliable, and predictable private connectivity.

“To help address on-premises issues, which often require deep technical networking expertise, we continue to partner closely with Cisco to provide a better customer networking experience. Working together, we can solve the most challenging networking issues encountered by enterprise customers using Azure ExpressRoute,” wrote Yousef Khalidi CVP, Azure Networking, in a blog post.

Cisco Solution Support offers additional support and guidance options for Azure ExpressRoute, helping customers with the on-premises end of the network. Customers will also have support from Cisco solution experts to quickly resolve their issues and connect to the Microsoft Cloud Platform.

“With our customers in mind, Cisco is extending our Solution Support portfolio with a new network practice and offer for Azure ExpressRoute. This new offer for networking targeting the customers on premises network, allows us to leverage our world class networking expertise to assist customers using Cisco networking products and Microsoft Azure ExpressRoute to connect to the Microsoft Azure Cloud Platform,” wrote Joe Pinto, Senior VP, Cisco’s Technical Services Group, in a separate blog post.

Furthermore, Microsoft has integrated Network Performance Monitor (NPM) into ExpressRoute, enabling customers to monitor connectivity to PaaS services (such as Azure Storage) as well as SaaS services (such as Office 365). This will provide deeper visibility into ExpressRoute network traffic. It will be generally available in mid-February in six regions.

Also read: Microsoft adds new monitoring and troubleshooting services to Azure Site Recovery

Additionally, Microsoft has merged public and Microsoft peering to simplify the management and configuration of ExpressRoute. ExpressRoute configuration previously required customers to have ExpressRoute circuits in two different cities; Microsoft is now planning to provide a second ExpressRoute site in cities that already have one. As of now, the second peering location is available only in Singapore.

Categories
Cloud Datacenter News

HPE simplifies multi-cloud management with OneSphere

At its Discover 2017 customer conference in Madrid, HPE introduced a simplified multi-cloud management platform called OneSphere that provides a combined experience across public clouds, on-premises private clouds and software-defined infrastructure.

HPE has designed OneSphere to address the needs of developers, IT operators, data scientists, researchers and enterprises to build clouds, deploy apps, and gain insights faster.

“Our customers need a radically new approach – one that’s designed for the new hybrid IT reality,” stated Ric Lewis, senior vice president and general manager, Software-Defined and Cloud Group at HPE. “With HPE OneSphere, we’re abstracting away the complexity of managing multi-cloud environments and applications so our customers can focus on what’s important – accelerating digital transformation and driving business outcomes.”

OneSphere provides a SaaS portal through which it offers access to a pool of IT resources, including public cloud services and on-premises environments. It also offers a unified experience across clouds, sites, orchestration tools, PaaS and containers, minimizing the need for specialized skills.

HPE said that managing multi-cloud environments with traditional management solutions is complicated, requiring multiple points of management and consuming more resources and cost. Such solutions are also difficult to set up and manage, since most companies use a combination of public cloud and on-premises resources.

HPE OneSphere has been designed to simplify these management complications, providing users one-stop access to all their applications and data from anywhere.

It works across containerized workloads, bare metal applications, and VMs, enabling internal stakeholders to compose hybrid clouds.

OneSphere streamlines DevOps so that enterprises get deep insights across public and on-premises environments, enabling them to speed up cycle times, improve productivity, and generate cost savings.

Also read: HPE’s new high-density compute and storage solutions to help businesses adopt HPC and AI applications

HPE said that OneSphere is a solution to accelerate digital transformation, and is ideal for businesses that want to capitalize on digital disruption and enable a broad range of new customer experiences.

It will be available from January 2018.
