Categories
News, Start-Ups, Web Hosting

German domain registrar united-domains launches rankingCoach for clients 

  • With rankingCoach, united-domains offers SMBs an effective search engine optimization solution to get found on Google & Co.
  • Based on individual tasks and step-by-step video tutorials, even inexperienced users can optimize their websites themselves.
  • With rankingCoach, users can compare themselves with their competitors and develop an effective, sustainable strategy.

united-domains AG, a multiple award-winning hosting and domain service provider from Germany, has chosen rankingCoach as the SEO solution for its clients. The cost-effective search engine optimization tool promises more online visibility, more visitors and therefore more sales for small and medium-sized companies.

“If you create a great website with a lot of passion, you should make sure that it is visible online,” says Thomas Meierkord, COO of rankingCoach.

“Our vision is to give SMBs the opportunity to take their digital marketing into their own hands and compete with the top companies. With our application, united-domains customers can now optimize their websites and be found on Google and Co. Therefore, I am looking forward to a successful partnership with united-domains, who have proven their qualities in domain service for years.” 

Also read: Enhancing on-premise solutions market in India: ZNet becomes a distributor of Acronis

With rankingCoach, clients can improve their search engine placement even without experience or knowledge of online marketing. After analyzing the website, rankingCoach guides the client step by step through the tasks and offers more than 1,200 tailored video tutorials. Users can optimize up to 20 keywords at up to 3 locations and track their performance daily with reports.

“With rankingCoach, we provide our clients with an easy-to-use and cost-effective tool for search engine optimization,” says Thomas Reimer, product manager at united-domains.

In addition to their website’s performance, users can also compare themselves with competitors online. rankingCoach provides simple charts that show the rankings of their own website in search engines compared to those of competitors. united-domains becomes the 46th international partner reselling rankingCoach’s digital marketing products.

Read next: Cybersecurity market in India to reach $3 billion by 2022: DSCI-PwC Report
Categories
News

Microsoft Teams PowerShell module now up for grabs

Last year, Microsoft released the Microsoft Teams PowerShell module in beta. The company has now brought it to general availability.

Microsoft Teams comes with a wide range of tools for IT admins, which allow them to manage the product through the Teams admin center, PowerShell controls, and Graph APIs.

The PowerShell module allows IT admins or Teams Service Admins to manage the lifecycle of teams within the organization. They can identify and manage teams on behalf of users, and make updates to teams faster, whether it is changing memberships or managing team settings.

The generally available version of the Microsoft Teams PowerShell module leverages only v1.0 Graph APIs.

The new PowerShell module will come with all the cmdlets required for creating and managing teams. Cmdlets are lightweight commands used in the PowerShell environment.

There are around 14 cmdlets in the Teams PowerShell module. For instance, admins can use the Connect-MicrosoftTeams cmdlet to connect an authenticated account to the Teams environment, and Disconnect-MicrosoftTeams to disconnect it. The complete list of cmdlets with descriptions is available here.
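
As an illustration, a typical admin session with the module might look like the following sketch. The team display name is a hypothetical example; cmdlet parameters beyond those shown are not covered by the announcement:

```powershell
# Install the generally available module from the PowerShell Gallery
Install-Module -Name MicrosoftTeams

# Connect an authenticated admin account to the Teams environment
Connect-MicrosoftTeams

# List visible teams, then create a new one (hypothetical name)
Get-Team
New-Team -DisplayName "Contoso Support"

# Disconnect the account when finished
Disconnect-MicrosoftTeams
```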

In the generally available version, Microsoft has made a number of improvements to the cmdlets. For example, admins can now specify a Teams Government Environment in which the organization is located.

Also read: Azure Data Studio gets support for PostgreSQL, SQL Notebooks and more extensions

“Going forward, we will be maintaining both a Preview and Generally Available versions of the Microsoft Teams PowerShell module. This will allow us to deliver new, preview functionality to our customers faster for testing, while ensuring that our Generally Available module continues to leverage only 1.0 APIs,” wrote Christopher Bryan, Product Marketing Manager – Microsoft Teams, in a blog post.

Categories
Cloud, News

GitLab puts power of Kubernetes in developer workflow with extended integration

GitLab is extending its integration with Kubernetes by bringing it into the developer workflow, making it easier for enterprises and developers to drive innovative software to production.

A single application for the DevOps lifecycle, GitLab comes with a built-in container registry and Kubernetes integration. This simplifies getting started with containers and cloud-native development, and streamlines application development processes.

Using GitLab, all the developers in the organization can access Kubernetes, which helps speed up software development.

“By allowing people to quickly connect Kubernetes clusters to their projects we are helping many enterprises embrace the cloud native way of building applications,” says Sid Sijbrandij, CEO at GitLab.

“By providing a single application we allow enterprise developer and operations teams to embrace Kubernetes every step of the way in their software development process. We’ve seen a large financial institution go from a single build every two weeks to over 1,000 self-served builds a day using GitLab. It is wonderful to see the scale we can unlock for organizations by providing access to Kubernetes in the developer workflow.”

Developers will be able to connect their existing Kubernetes clusters (on any platform) to GitLab. They can also easily set up and configure new clusters using the Google Kubernetes Engine (GKE) integration. After setting up the clusters, they can install managed apps like Helm Tiller, Ingress, and Prometheus on their cluster.

Integration of GitLab and Kubernetes will provide a number of advanced capabilities, like Deploy Boards, Canary Deployments, Kubernetes monitoring, Auto DevOps, and Web Terminals.

Deploy Boards will provide a unified view of the current health and status of CI/CD environments that run on Kubernetes.

GitLab will provide automatic detection and monitoring of Kubernetes clusters via Prometheus.

Also read: Top 5 collaboration tools for DevOps teams

Further, Auto DevOps is a default setting for CI/CD that automatically generates pipelines without any configuration. Lastly, Interactive Web Terminals provide instant web access to a terminal for remote environments, making troubleshooting easier.

Categories
Cloud, Cloud News

Equinix will bring its ECX Fabric capabilities to APAC this year

Equinix is planning to bring its Cloud Exchange Fabric (ECX Fabric) capabilities to the Asia Pacific region this year.

The Equinix Cloud Exchange Fabric leverages software-defined networking (SDN) to enable multiple connections through one port. It is a platform which allows customers to connect with any other ECX Fabric customers in the region.

It helps customers implement economical data replication and synchronization for business continuity, and enables inter-country connectivity to backup cloud services.

Its availability in APAC will help enterprises in the region easily provision connections to other ECX Fabric facilities in Australia, Hong Kong, Japan, and Singapore, building highly available architectures in near real time.

ECX Fabric customers can also directly connect to key service providers including Alibaba Cloud, Amazon Web Services (AWS), Google Cloud, IBM Cloud, Microsoft Azure, Oracle Cloud, and SAP.

Equinix believes that interconnection with key business partners and customers is becoming an essential element of digital supply chain, as companies move to digital business models.

The interconnection removes the need to traverse the public internet or a WAN, and enables direct and private access to cloud service providers, SaaS providers, network service providers and more. The key advantages of interconnection are improved application performance, reduced latency, increased security, and improved network control.

The Equinix Cloud Exchange Fabric capabilities are currently available in all ECX Fabric locations in the Americas and EMEA regions. They are expected to be available in APAC in the third quarter of 2018.

Furthermore, Equinix recently partnered with Omantel to establish a joint venture in Barka, Oman. The joint venture will provide data center and interconnection services to carriers, content providers, and cloud providers in the Middle East.

Also read: Equinix closes acquisition of Australian datacenter provider Metronode

Image source: Equinix

Categories
Cloud, Cloud News, News

Corero discovers ‘Kill Switch’ to mitigate the Memcached DDoS attacks 

Researchers from Corero Network Security have discovered a practical ‘kill switch’ that can mitigate the Memcached vulnerability recently used to cause record-breaking DDoS attacks. They have disclosed the existence of this switch to national security agencies.

Corero said that the potential of the Memcached vulnerability is more extensive than originally reported, and that attacked servers can be used by hackers to steal or modify data, including database records, emails, API data, Hadoop information, and website customer information.

Memcached, an open source memory caching system, decreases data access time by storing data in RAM. It was originally designed to be inaccessible from the internet, so access does not require authentication. The exploit allows attackers to generate fake requests and magnify attacks, creating a traffic flood. Currently, more than 95,000 servers answer on UDP port 11211, which means all of them are vulnerable to DDoS attacks.

“Memcached represents a new chapter in DDoS attack executions. Previously, the most recent record-breaking attacks were being orchestrated from relatively low bandwidth Internet of Things (IoT) devices. In contrast, these Memcached servers are typically connected to higher bandwidth networks and, as a result of high amplification factors, are delivering data avalanches to crippling effect. Unless operators of Memcached servers take action, these attacks will continue,” explained Ashley Stephenson, CEO at Corero Network Security.  

The kill switch discovered by Corero can be an effective solution to mitigate the attacks. The security firm claimed that it tested the kill switch on live attacking servers and found it 100% effective. “It has not been observed to cause any collateral damage.” 

The kill switch sends a ‘flush_all’ command to the attacking server, which suppresses the DDoS exploitation. The command invalidates the malicious payloads by clearing the cache of the vulnerable server.
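
The ‘flush_all’ command is part of memcached’s standard text protocol, so issuing it requires nothing more than a plain socket. Corero has not published its exact tooling; the following is a minimal Python sketch of how such a command could be sent (the host is a placeholder, and sending commands to servers you do not operate may be unlawful):

```python
import socket

def flush_memcached(host: str, port: int = 11211, timeout: float = 3.0) -> bytes:
    """Send the memcached text-protocol 'flush_all' command over TCP
    and return the server's raw reply (b'OK\r\n' on success)."""
    command = b"flush_all\r\n"
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(command)
        return sock.recv(64)
```

On a vulnerable server, the reply `OK` confirms the cache, including any attacker-planted payloads, has been invalidated.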

When GitHub was hit by the DDoS attack last week, the issue was reported to the National Vulnerability Database (NVD). NVD found that Memcached version 1.5.5 contained an insufficient control of network message volume vulnerability in the UDP support of the memcached server. This issue has been fixed in version 1.5.6.

Memcached servers need to be updated to the latest version, which disables the UDP protocol by default.

Categories
Cloud, Cloud News, Datacenter, News

Microsoft introduces services for moving SQL Server and applications to Azure to help customers get better ROI

Microsoft announced a number of new services and updates to its Azure cloud platform, aimed at making it easier for organizations to migrate mission critical database workloads to its cloud data centers.

The new services include SQL Database Managed Instance, Azure Hybrid Benefit for SQL Server, Azure Database for MySQL and PostgreSQL, Azure Database Migration Service and Azure Migrate.

The SQL Database Managed Instance is a managed version of SQL Server running in the cloud, which enables migration of SQL Server workloads to a completely managed database service. Currently available for preview, the new service is fully compatible with SQL Server engine and native virtual network (VNET).

To streamline database migration to SQL Database Managed Instance, Microsoft also announced that it is expanding Azure Database Migration Service (Azure DMS) to support SQL Database Managed Instance. DMS is a comprehensive solution for moving on-premises SQL Server to Azure SQL Database with complete database support.

“Managed Instance offers full engine compatibility with existing SQL Server deployments including capabilities like SQLAgent, DBMail, and Change Data Capture, to name a few, and built-on the highly productive SQL Database service,” wrote Rohan Kumar, Corporate Vice President, Azure Data, in a blog post.

Microsoft has expanded its Azure Hybrid Benefit program with support for SQL Server. Customers will now be able to apply on-premises SQL Server licenses to Managed Instance. This program will help customers save up to 30% on SQL Database Managed Instance bills when migrating an existing license to Azure.

Azure database services for MySQL and PostgreSQL, which had been available for preview since last year, are now generally available. These services will be especially helpful for organizations that want to leverage the benefits of open source software without dealing with the hassles of scaling and patching databases.

“Azure database services for MySQL and PostgreSQL offer the community versions to ensure simplest migration and fastest path to development with these services on Azure,” added Kumar.

Also read: Microsoft Azure services to help government customers with digital transformation

Microsoft also announced the general availability of Azure Migrate, software for cloud migration planning. Azure Migrate provides deep insight into cloud migration projects, so customers can identify right-sized resources, cost estimates, and workload readiness.

Categories
Cloud, Cloud News, Datacenter, News

Acronis bolsters its leadership position in cloud data protection with Google Cloud partnership

Acronis, a leading provider of hybrid cloud data protection and storage across the globe, announced a new strategic alliance with Google Cloud, to enhance Acronis backup and disaster recovery solutions with integration of Google Cloud Platform.

The partnership will help Acronis expand the number of cloud regions where customers and partners can back up their data, offering faster upload speeds and more options to control data in a greater number of countries.

Future releases of Acronis’ backup, disaster recovery, and file sync and share solutions will have Google Cloud Platform integrated into them. The integration will enhance Acronis’ solutions and provide its customers easy access to Google Cloud Platform. They will be able to reduce their Recovery Time Objectives (RTO) and achieve Service Level Agreement (SLA) targets in DR situations.

“We are seeing an increasing number of customers take advantage of our storage products for data protection because they offer better flexibility, performance and economics,” said Adam Massey, Director, Strategic Technology Partners, Google Cloud. “By partnering with Acronis, an established leader in the industry, we hope to make the customers’ data protection journey even easier.”

Acronis will also extend support for Google Cloud workloads, enabling use of Google Cloud Platform for disaster recovery, and providing a Google Cloud deployment model for the Acronis Backup management server.

The partnership will also give Acronis partners new opportunities and flexibility to address customers’ data protection demands across cloud, datacenter, remote, and mobile devices. They will be able to offer easy and fast data protection services to global customers who already use Google Cloud Platform.

“The integration of Google Cloud with Acronis backup products is one of the top requests from our users,” said John Zanni, Acronis President. “There are many businesses who have invested in public cloud infrastructure. We’re making it easy for them to use it with Acronis.”

In January this year, Acronis partnered with Plesk to integrate Acronis Backup Cloud into Plesk’s website management platform, enabling hosting and cloud service providers to easily back up Plesk servers.

Categories
Cloud, Cloud News, News

Kubernetes becomes the first ever project to graduate from CNCF

Kubernetes has become the first member of CNCF (Cloud Native Computing Foundation) to graduate from an incubating status, signaling a strong commitment to code quality and security best practices.

When CNCF was founded in 2015, Kubernetes was its inaugural project. Today, Kubernetes is contributed to by more than 11,000 developers and has over 75,000 commits on GitHub. Among the 1.5 million projects on GitHub, Kubernetes ranks second for authors, ninth for commits, and third among the top 30 highest-velocity open source projects.

Top cloud providers today offer their own Kubernetes services, and it is used by renowned organizations including Uber, Bloomberg, The New York Times, Lyft, eBay, and Blackrock. As reported by RedMonk, of the 71% of Fortune 100 companies that use containers, 50% use Kubernetes as their container orchestration platform.

This shows how mature and resilient a project Kubernetes has become in just over two years. A voting process is held for a project to enter CNCF as an inception, incubating, or graduated project; a project is accepted when two-thirds of the CNCF Technical Oversight Committee (TOC) votes for it.

For a project to graduate, it has to earn and maintain a Core Infrastructure Initiative (CII) Best Practices Badge, have committers from at least two organizations, adopt the CNCF Code of Conduct, define a project governance and committer process, and receive a supermajority of votes from the TOC.

Kubernetes earned the CII Best Practices Badge in 2016, and successfully fulfilled all the graduation criteria. The TOC voted for Kubernetes at Open Source Leadership Summit to become the first ever CNCF project to graduate.

The graduation status shows that Kubernetes can manage containers at scale, across any industry, in companies of all sizes. “As a graduate, Kubernetes is in an even stronger position to grow faster and sustain a vibrant, healthy and diverse technical community.”

Also read: HPE and Portworx join hands to launch a new solution using Kubernetes for stateful container deployment

“We would like to congratulate the Kubernetes project community that has worked with us sometimes as students, frequently as peers, and often as teachers. Kubernetes would not be where it is today without the attention and devotion so many have given it across the industry,” said Sarah Novotny, Google Cloud’s open source strategy lead. 

Categories
Cloud News, News

Monstrous DDoS attacks: two record breaking attacks detected within a week

With the cyberworld facing disturbing security threats this early on, 2018 is shaping up to be an eventful year on the cyberthreat front. GitHub faced the then-largest-ever DDoS (Distributed Denial of Service) attack last week, which peaked at 1.35 terabits per second, or 126.9 million packets per second.

And the record for the largest DDoS attack was broken within a week, when earlier this month a customer of a US-based service provider suffered a 1.7 Tbps attack, as reported by Arbor Networks. Although no outage occurred because the provider had proper security measures in place, the attack is proof enough that memcached attacks are among the cyberthreats network administrators should take seriously in the future.

These DDoS attacks were based on UDP (User Datagram Protocol) Memcached traffic. Memcached is a protocol used to cache data and decrease the strain on heavy data stores such as disks or databases. It lets clients query the server’s key-value store, and is intended for systems that aren’t exposed to the public internet.

However, attackers spoof the source IP address of UDP traffic and send requests to a vulnerable UDP server. The server prepares the responses, not knowing the requests are fake. The much larger responses are then delivered to the unsuspecting victim host, which causes the attack.
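
What makes the technique so dangerous is the amplification factor: a tiny request can elicit a response tens of thousands of times larger, all aimed at the spoofed victim address. A back-of-the-envelope sketch (the per-response size below is an illustrative assumption; public analyses reported amplification factors above 50,000x):

```python
# A memcached UDP request carries an 8-byte frame header followed by
# the ASCII command; a 'stats' probe is only 15 bytes on the wire.
request_bytes = 8 + len(b"stats\r\n")

# A single response can return large cached values; assume ~750 kB here.
response_bytes = 750_000

# Every spoofed request multiplies the attacker's bandwidth by this factor.
amplification = response_bytes / request_bytes
print(f"{request_bytes}-byte request -> {response_bytes}-byte reply "
      f"(~{amplification:,.0f}x amplification)")
```

At that ratio, an attacker with a modest uplink can direct terabits of traffic at a victim by bouncing requests off thousands of open servers.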

When GitHub was attacked last week, its servers stopped responding for a few hours, until Akamai filtered out the malicious traffic from UDP port 11211 (the default port used by memcached). Akamai warned that because of memcached’s reflection capabilities, the same attack might soon occur again at a higher data rate.

“Many other organizations have experienced similar reflection attacks since Monday, and we predict many more, potentially larger attacks in the near future. Akamai has seen a marked increase in scanning for open memcached servers since the initial disclosure,” stated Akamai in a blog post. “Because of its ability to create such massive attacks, it is likely that attackers will adopt memcached reflection as a favorite tool rapidly.”

Arbor Networks confirmed the second DDoS attack and its data rate, and mitigated it using ATLAS global traffic and DDoS threat data system.

“While the internet community is coming together to shut down access to the many open memcached servers out there, the sheer number of servers running memcached openly will make this a lasting vulnerability that attackers will exploit,” wrote Carlos Morales, VP of sales, engineering and operations at Arbor Networks in a blog post.

These attacks can be mitigated by blocking UDP traffic from port 11211, and by locking down systems to avoid becoming part of such attacks.
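
On Linux, that advice translates to a firewall rule like the following sketch. Interface names and rule ordering depend on the environment, and memcached 1.5.6 and later also let operators disable UDP directly:

```shell
# Drop inbound UDP traffic destined for memcached's default port
iptables -A INPUT -p udp --dport 11211 -j DROP

# Or start memcached with UDP disabled (-U 0) and bound to localhost only
memcached -U 0 -l 127.0.0.1
```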

Prior to this, the biggest DDoS attack was detected in September 2016 in Brazil, which peaked 650 gigabits per second. The memcached DDoS attacks are the first ones to cross terabit limit.

Also read: Comodo Threat Research Lab uncovers new trick used by hackers to attack enterprises  

Categories
Cloud, Cloud News, News

Microsoft Azure services to help government customers with digital transformation

Microsoft announced that the integration of Azure Stack with Azure Government cloud will be possible by mid-2018, enabling government clients to run the cloud technology in their own datacenters.

Azure Stack is an extension of Microsoft’s hybrid cloud approach that brings agility of public cloud to on-premises environments. It helps organizations to meet the requirements related to regulations, connectivity and latency, and enables customers to seamlessly move between public cloud environments and their own infrastructure.

Government agencies use hybrid cloud as the foundation of their IT modernization strategy. The integration of Azure Stack with Azure Government cloud will enable government customers to leverage the benefits of hybrid cloud, and bring basic as well as advanced cloud solutions to the tactical edge (such as a tank, aircraft, or field office), which can be connected or disconnected.

Azure Stack makes it possible for government agencies to process data in the field without worrying about internet connectivity and latency. Further, the analytics in Azure Government can be used to get detailed and accurate predictions.

“Government customers can now modernize their on-premises legacy applications that are not ready for the public cloud due to cyber defense or any other requirements. Azure Stack brings a core set of Azure services – containers, Web apps, Serverless computing –  and microservices architecture on-premises to update and extend legacy applications,” wrote Natalia Mackevicius, Director, Azure Stack in a blog post.

The integrated services will enable connections to Azure Government identity, subscription, registration, billing, as well as backup and disaster recovery. These can help federal cabinet agencies, government entities like military bases or embassies in foreign countries, and the Army, Navy, Air Force, and Marine Corps.

“Quite literally we’ve designed Azure Stack with the scenario of a submarine in mind,” Tom Keane, Microsoft Azure’s head of global infrastructure, told Reuters.

At the Microsoft Government Tech Summit in Washington D.C., Microsoft announced expansion of Azure Government’s support with two new Azure Government Secret regions for data classified as Secret.

Also read: Microsoft empowers developers with updated Quantum Development Kit

Furthermore, Microsoft added new PaaS services, Azure Site Recovery, Backup, and Azure App Service to its DoD (Department of Defense) Impact Level 5 Provisional Authorization.
