Categories
News Technology

Microsoft Defender ATP gets new Threat & Vulnerability Management functionality

Microsoft has announced the general availability of its Threat & Vulnerability Management (TVM) solution. The tech giant says the solution was made available on June 30 and provides real-time vulnerability management to organizations.

The new solution is a built-in capability of Microsoft Defender ATP (Advanced Threat Protection). It uses a risk-based approach to discover, prioritize, and mitigate endpoint vulnerabilities and misconfigurations.

While designing the solution, Microsoft worked with a dozen enterprise customers, including Telit, a global leader in IoT enablement, to understand the limitations and complications of existing processes.

During the process, Microsoft found that current approaches to vulnerability scanning are slow and periodic. This can leave security blind spots between scans and flood organizations with vulnerabilities. Further, manual mitigation of vulnerabilities sometimes takes days, weeks, or months to complete, giving attackers a window in which to strike.

The Threat & Vulnerability Management solution aims to address these issues.

“Our goal is to empower defenders with the tools they need to better protect against evolving threats, and we believe this solution will help provide that additional visibility and agility they need,” wrote Rob Lefferts, Corporate Vice President at Microsoft Security, in a blog post.

The new offering provides several benefits to customers, including continuous discovery of vulnerabilities and misconfigurations, and prioritization based on business context and the dynamic threat landscape. It also correlates vulnerabilities with endpoint detection and response (EDR) alerts to expose breach insights.

Customers will also get machine-level vulnerability context during incident investigations, as well as built-in remediation processes through integration with Microsoft Intune and Microsoft System Center Configuration Manager.

The new TVM solution is now generally available. Existing customers can find it in the Microsoft Defender ATP portal, while new customers can sign up for a free trial.


Categories
News Technology

Usage of Linux on Azure surpasses Windows Server

The usage of Linux on Azure has surpassed that of Windows Server, Microsoft Linux kernel developer Sasha Levin confirmed to ZDNet.

The battle between Windows and Linux has been going on for over a decade. While Windows became the clearly dominant OS on desktops, Linux has won the battle on servers.

In 2016, Azure CTO Mark Russinovich revealed that 25% of Azure instances were Linux, a figure that increased to 40% the next year. Then in 2018, Microsoft told ZDNet that around 50% of Azure VMs were Linux.

This shows that Linux hasn’t won the battle overnight. More and more enterprises are choosing Linux over Windows when it comes to servers.

“Every month, Linux goes up,” Scott Guthrie, Microsoft’s Executive VP of the Cloud and Enterprise group, told ZDNet in September last year.

Microsoft users have been actively choosing Linux and open-source software for over 10 years, since Microsoft open-sourced ASP.NET. “We recognized open source is something that every developer can benefit from. It’s not nice, it’s essential. It’s not just code, it’s community,” said Guthrie. “We’re now the largest open-source project supporter in the world.”

Now, almost a dozen Linux distros are available on Azure, and that is without counting Microsoft’s own Azure Sphere.


Categories
News

NEC Server Software Enables Advanced and Secure Login to Websites in Compliance with FIDO2

NEC Corporation announced today the availability of its enhanced NC7000-3A server software, which will enable simple, secure and swift authentication of users for access to websites and mobile applications through biometric authentication.

In addition, NEC is also releasing SDK-based voice authentication that accurately identifies users by extracting the unique characteristics of their voices when they speak predetermined phrases. The FIDO2(1)-compliant server software and updated SDKs are scheduled to be available in July and August 2019 respectively.

NC7000-3A integrates with business/service provider user profiles and manages authentication activities for web services. This software is a FIDO-certified product that enables users to be authenticated without sending biometric information or any other personal information outside of a terminal, thereby reducing the risk of compromising biometric identities and passwords.

Following this update, NC7000-3A server software is now certified with the FIDO2 standards established by the FIDO Alliance(2), which promotes international standards for “password-less” online user verification.

Existing NC7000-3A server software is certified with FIDO UAF, which allows users to login with biometric authentication when using mobile applications, such as online banking. This latest update also supports FIDO2, which enables users of PCs and smartphones to use biometric authentication when logging in to websites as well. FIDO2 capability enables login using external authentication devices, such as security keys, through USB/NFC/Bluetooth communication standards.
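The flow behind this is a public-key challenge-response. The sketch below illustrates the idea with a deliberately tiny textbook-RSA keypair (an assumption made for readability; real FIDO2 authenticators use strong, hardware-backed keys and the full WebAuthn/CTAP protocol): the server stores only the public key, the private key never leaves the device, and the biometric check merely unlocks local signing.

```python
import hashlib
import secrets

# Toy RSA keypair with tiny primes -- for illustration only. Real FIDO2
# authenticators generate strong keys inside secure hardware.
p, q = 10007, 10009
n = p * q
phi = (p - 1) * (q - 1)
e = 65537
d = pow(e, -1, phi)  # modular inverse (Python 3.8+)

def authenticator_sign(challenge: bytes) -> int:
    """Runs on the user's device; the private exponent d never leaves it.
    A biometric check would gate access to this operation."""
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n
    return pow(h, d, n)

def server_verify(challenge: bytes, signature: int, public_key) -> bool:
    """Runs on the server, which stores only the public key (n, e)."""
    n_, e_ = public_key
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n_
    return pow(signature, e_, n_) == h

# Login: the server issues a fresh random challenge, the device signs it
# locally, and the server verifies. No biometric data, password, or
# private key ever crosses the network.
challenge = secrets.token_bytes(32)
assert server_verify(challenge, authenticator_sign(challenge), (n, e))
```

Because every login uses a fresh random challenge, a captured signature cannot be replayed, which is part of why this model resists phishing and credential theft.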

In addition, SDKs that support a variety of authentication options, including fingerprint, face and voice recognition, are available for Android OS and iOS, enabling customers to freely select and combine multimodal authentication.

This server software and SDK will improve the convenience of logging in and prevent spoofing, which will contribute to the security of web services that require identity authentication. Specifically, it will enable password-less authentication for e-commerce, digital banking, and web services provided by municipalities and government agencies.

Under NEC’s “Mid-term Management Plan 2020,” the company is actively promoting services in new fields that leverage network strengths. Through this software, NEC is flexibly leveraging its networks to accelerate the provision of NEC Smart Connectivity(3), which links data generated by people and industry to create new social value.

“The NC7000 series is at the core of the NEC Smart Connectivity program and has a solid record of installations for financial institutions and telecommunications carriers,” said Takashi Sato, General Manager, Digital Services Solution Division, NEC Corporation. “This enhancement strengthens the role of Bio-IDiom(4), NEC’s portfolio of biometric solutions, in the provision of highly secure and convenient user certification, which supports the realization of a society where people, goods and services are reliably linked.”

Andrew Shikiar, executive director and chief marketing officer, FIDO Alliance, added: “NEC’s consistent efforts as a FIDO Alliance sponsor member help to promote the evolution and globalization of simpler, stronger FIDO Authentication. We are pleased to see NEC introduce its FIDO2 Certified server today as part of the strong and continuously growing ecosystem aimed to reduce the world’s reliance on passwords.”

“I am very pleased to see NEC obtain FIDO2 certification and to reinforce its standing as a member of the FIDO Alliance, whose goal is to supplant reliance on passwords,” said Koichi Moriyama, a Member of the FIDO Alliance Executive Council, Chairman of the Japan Working Group, Senior Director of Product Innovation, Product Department, NTT DOCOMO, INC. “As one of Japan’s leading ICT companies, we look forward to working together to accelerate efforts to create a world without passwords through deployment of FIDO certified products.”

References:
(1) https://fidoalliance.org/fido2/
(2) https://www.fidoalliance.org
(3) This is a collective term for network services that leverage NEC’s expertise and track record in network technologies and related solutions. NEC will utilize 5G and LPWA to create new data distribution that connects previously untapped data in various fields, such as social infrastructure, manufacturing, and retail, and delivers it to the people and goods that need it.
(4) “Bio-IDiom” is NEC’s portfolio of biometric identification solutions, including face, iris, fingerprint, palm print, finger vein, voice, and ear acoustic solutions.
https://www.nec.com/en/global/solutions/biometrics/index.html


Categories
News

GIGA Data Centers Officially Opens Colocation Facility in Mooresville, North Carolina Offering Capacity up to 60MW

GIGA Data Centers, LLC (GIGA), a new breed of data center provider creating affordable, hyper-scale power facilities with unprecedented energy efficiency, announced the company officially opened its CLT-1 Data Center in Mooresville, NC with a grand opening event that included facility tours, a ribbon-cutting ceremony and remarks from local officials.

The new facility leverages a highly energy-efficient modular design first used in turnkey installations provided to enterprise companies and agencies, including the U.S. Department of Energy. The data center is connected to high-reliability transmission lines fed by two Duke Energy power generation plants supplying green and renewable electricity. In addition, the Mooresville facility represents the latest in data center innovation to cost-effectively support the higher power requirements needed for high-performance computing in AI, financial services, healthcare and many other industry segments where this elite level of compute service has previously been out of reach for the non-enterprise company.

At the event, GIGA President & CEO Jake Ring said: “With our new facility open for business, small to mid-sized companies will finally have access to high-performance compute capabilities at affordable prices.” Mr. Ring attributes the affordable pricing to GIGA’s WindChill® system, which provides hot/cold aisle isolation and adiabatic cooling to dramatically lower data center operating costs, yet offers flexible rack-power densities from 5 kilowatts up to 50 kilowatts per rack cabinet, all at 4.1¢ per kWh.

CLT-1 is a 165,800 square foot data center, of which 120,000 square feet make up the main data hall. The Tier-3 compliant facility is a complete departure from traditional raised-floor construction and supports power and cooling from 5kW to 50kW per 52U rack, at a guaranteed Power Usage Effectiveness (PUE) rating of < 1.15. In addition, customers benefit from an exemption of all sales and use tax until 2029, including on electricity, and property tax is rebated up to 80 percent until 2026.
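For context, PUE is simply the ratio of total facility power to the power delivered to IT equipment, so a guarantee of < 1.15 means cooling and power distribution add less than 15% overhead. A quick sketch with illustrative numbers (not GIGA’s actual measurements):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT
    equipment power. An ideal facility scores 1.0 (every watt reaches
    the servers)."""
    return total_facility_kw / it_equipment_kw

# Hypothetical example: a 10,000 kW IT load inside a facility drawing
# 11,400 kW overall gives a PUE of 1.14, i.e. only 14% overhead goes
# to cooling and power distribution.
print(round(pue(11_400, 10_000), 2))  # 1.14
```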

“GIGA’s new facility is an efficiency milestone for the colocation industry that places world-class hosting and carrier-neutral services in an ideal location, to offer a lower-priced option over more costly data center fees charged in other markets,” Ring added.

Categories
Infographics

Google: The story behind the Internet’s giant

Today Google is the most respected and influential name in the digital world. It wouldn’t be an exaggeration to say that it is the unchallenged ruler of the internet kingdom. However, like any other brand, Google too has an interesting story behind its formation.

It started out as a research project

In 1996, two university students, Larry Page and Sergey Brin, started a research project. At the time, the duo didn’t know they were taking the first baby steps towards building the digital world’s largest company. Page was exploring the mathematical properties of the World Wide Web from a unique perspective, focusing on the connection between the number and nature of backlinks and the web pages they pointed to. Brin soon joined him on the project.

Modest functioning and meagre resources

Google started out on a very humble note; during its initial stage, it could not process more than 50 pages. Just 10 hard drives of 4 GB each were sufficient to store its limited data, a tiny fraction of the resources Google uses today for a massive 100 million GB. However, it seems Larry and Sergey already had something great in mind, as they used a Lego design to support future scalability.

A massive leap to success

Thanks to the innovative concept and constant effort, the company progressed by leaps and bounds. Page and Brin may not have anticipated the company’s massive potential, but they seem to have had some inkling, which is why they approached the digital giant of the time, Yahoo!, about selling Google. Yahoo, however, wasn’t convinced about the business viability of the young project and declined the offer. It took Yahoo some time to realize Google’s worth; when it returned, having “really” understood the company’s value, it was ready to pay a massive $3 billion. Was that amount really massive for Google? Perhaps not, which is why Google refused the offer.

The following infographic will take you on a quick tour around Google’s journey towards becoming an Internet giant. Take a look.

About Guest Author:

Mark Andrew is a web developer with varied interests- travel, wildlife, history, art, and of course technology! He is a keen supporter of free internet and when not in the cabin, he loves to spend his time outdoors exploring the world, people and technology.

Categories
Cloud Cloud News New Products News Technology

Google focuses on building an AI-first world with its new Cloud TPUs

Artificial Intelligence continues to lead the way in bringing out new software designs and capabilities, and Google’s new AI chip is the latest addition.

At the company’s annual developer conference held recently, CEO Sundar Pichai announced a new AI chip that could be a technological breakthrough in the world of machine learning and automation.

With this, Google signaled the increasing role of AI in software and hardware development.

The chips, known as Cloud Tensor Processing Units (TPUs), are designed to speed up machine learning operations and are an upgrade from the first generation of chips Google announced at last year’s I/O.

The second-generation chips can deliver nearly 180 teraflops of performance, far higher than the first generation, which could only handle inference. The new chips can also be used to train machine learning models. Training is an important part of machine learning, as it is what enables a program to identify and differentiate between images, data and other things.

Pichai also went on to announce machine-learning supercomputers built from Cloud TPUs and high-speed data connections. He said, “We are building what we think of as AI-first data centers. Cloud TPUs are optimized for both training and inference. This lays the foundation for significant progress (in AI).”

Google emphasized the application of machine learning and AI models to deliver better work and performance. Google.ai is also the result of collective efforts towards building an AI-first world.

However, Google did not say anything definite about bringing the chip to market; rather, it said it will let companies rent access to the chip via its cloud computing service.

Categories
Innovation News Technology

Microsoft shifts its focus from a mobile-first, cloud-first world towards the intelligent cloud

Microsoft, at its Build 2017 conference, revealed its vision of a future workplace that will run on the power of Artificial Intelligence, IoT and robotics.

Microsoft’s keynotes focused on serverless and edge computing models that will bring new capabilities to almost every industry, from healthcare to manufacturing.

CEO Satya Nadella spearheaded several announcements with the developer community in mind. He also advised developers to make good use of technology to help society advance.

He said, “We should empower people with technology – inclusive design can be an instrument for inclusive growth.” He also added, warning developers to be more responsible, “It is up to us to take responsibility for the algorithms we create.”

He considers IoT the main data driver, one that needs to be used analytically to extract maximum benefit.

“The platform shift is all about data. When you have an autonomous car generating 100GB of data at the edge, the AI will need to be more distributed. People will do training in the cloud and deploy on the edge – you need a new set of abstractions to span both the edge and the cloud,” he said.

Microsoft has always emphasized a cloud-first, mobile-first world, but with this event it shifted its focus towards cognitive solutions and AI. Developers will get access to four new cognitive services on top of the 25 existing ones, three of which will be user-customizable.

“We are moving from mobile first, cloud first to a world made from an intelligent cloud and an intelligent edge,” said Nadella.

Thus, the move towards the intelligent cloud will be the new mantra at Microsoft.

The session also touched on edge computing, with Microsoft unveiling Azure IoT Edge, which runs cross-platform on Windows and Linux and on devices smaller than a Raspberry Pi.

Nadella talked about how AI could be used to identify objects and people and bring more automation to the future workplace. A demo involving a heart patient walking around with attached sensors signified the level of AI application to come: the sensors were capable of notifying a nurse if the patient felt unwell at any point.

Though the keynotes seem promising, putting them into practice will certainly place a lot of responsibility on developers.

Categories
Cloud News Technology

Office 365 plays pivotal role in Microsoft’s growth and presents opportunities for MSPs

As per 451 Research’s Hosting and Cloud Study 2017, there has been an increase in the number of hosted application services used by organizations. Last year, nearly 55% of organizations used email, collaboration and productivity apps to increase business productivity. With digital transformation, hosted application services like file storage, database and warehousing, CRM and others are witnessing increased usage.

The need for collaborative communication and emailing solutions like Office 365 is also rising. As per data released by Microsoft, there are over 70 million commercial active Office 365 users. Customers want more than just an emailing solution, and thus Office 365 is witnessing a rapid rise and playing a pivotal role in Microsoft’s growth.

As per Microsoft’s news on its third-quarter results, revenue in Productivity and Business Processes was $8.0 billion, an increase of 22% (up 23% in constant currency), with the following business highlights:

  • Office commercial products and cloud services revenue increased 7% (up 8% in constant currency) driven by Office 365 commercial revenue growth of 45% (up 45% in constant currency).
  • Office consumer products and cloud services revenue increased 15% (up 14% in constant currency) and Office 365 consumer subscribers increased to 26.2 million.

“Our results this quarter reflect the trust customers are placing in the Microsoft Cloud,” said Satya Nadella, chief executive officer at Microsoft. “From large multi-nationals to small and medium businesses to non-profits all over the world, organizations are using Microsoft’s cloud platforms to power their digital transformation.”

Office 365’s automatic updates, improved user adaptability, security, and newly added features like Teams and Flow have made it a favourite among SMBs and corporate houses alike.

Increasing demand for Office 365 and migration of email systems is a rising opportunity for service providers as well. As per 451 Research, of all SMBs surveyed, 83% have yet to switch to Office 365.

Migration is not a simple step: it includes many important stages, from preparation of user accounts, directory synchronization and mailbox data migration through cutover to clean-up and process completion, and it needs the help of a managed service provider.
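Sketched as a pipeline (the stage names below are illustrative labels for the steps listed above, not any actual migration tool’s API), the key point is that each stage must complete and be validated before the next one begins:

```python
# Hypothetical stage names mirroring the stages listed in the article;
# a real MSP would replace each step with actual tooling and checks.
MIGRATION_STAGES = [
    "prepare_user_accounts",      # provision and licence target accounts
    "directory_synchronization",  # sync the on-premises directory
    "mailbox_data_migration",     # copy mailbox contents in batches
    "cutover",                    # repoint MX/DNS and switch users over
    "cleanup",                    # decommission old systems and verify
]

def run_migration(stages=MIGRATION_STAGES):
    """Run stages strictly in order; each would be validated before the
    next begins, which is where an MSP's expertise comes in."""
    completed = []
    for stage in stages:
        # ... perform the stage and validate its outcome here ...
        completed.append(stage)
    return completed

assert run_migration()[-1] == "cleanup"  # cleanup only after cutover
```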

Categories
Cloud Interviews New Products News Technology

“We intend to build full-featured IT infrastructure solutions for SMBs”-Sergey Nevstruev, Anturis

Small and medium-sized businesses (SMBs) are growing, and with this growth, they’re changing the way they use technology to run their businesses. However, with their purse strings drawn tight and no clear roadmap for IT implementation, SMBs generally invest in IT in a phased manner. The problem is compounded for them, as the present market only offers solutions that are either expensive and bloated or open-source with a great need for fine-tuning and customizing. With limited knowledge of technology and tight budgets, neither option seems feasible for SMBs.

This is where a product like Anturis comes in. Promising features on par with enterprise-level IT monitoring software, without the exorbitant prices that generally accompany it, Anturis sounds like a pretty solid service that can play a strategic role in businesses of all sizes, helping companies do more with less to realize cost savings and profitability. Plus, it’s in beta and currently free for the first six months of use. What’s not to love?

More on this from Mr. Sergey Nevstruev, CEO of Anturis, himself, with whom we recently had the opportunity for a Q&A session. But before that, let’s have a look at the comprehensive demo below for a firsthand look at the benefits of Anturis monitoring solutions.

At Anturis, we intend to fill the gap and build full-featured cloud IT infrastructure monitoring and troubleshooting solution for SMBs: Easy to set up, reliable and affordable.

– Sergey Nevstruev, CEO, Anturis.

Mr. Sergey Nevstruev, CEO, Anturis.

Q: What is your name and role with Anturis?

A: Sergey Nevstruev, CEO.

Q: For those who don’t know what Anturis is, can you please describe it briefly?

A: A vanguard IT solutions company, Anturis Inc. is the developer of IT infrastructure monitoring and troubleshooting solutions for small to medium sized businesses. Anturis, now available in beta, delivers organizations of all kinds a 24×7 comprehensive monitoring and troubleshooting service that is both feature rich and easy to set up and use. Anturis, Inc. was founded by successful IT entrepreneurs Serguei Beloussov, Max Tsypliaev and Ilya Zubarev.

Q: Anturis comes across as a service that provides the best of both worlds. It promises to deliver features at par with enterprise-level IT monitoring software, without the grotesquely high prices that generally accompany them. What was the thought process behind coming up with this initiative? Were SMBs your primary target from the word go?

A: Anturis’ main target is the SMB marketplace. As more and more SMBs come online, their businesses rely on online services. Restaurants allow customers to book reservations online, and medical clinics and hair stylists now allow clients to make appointments online as well. Yet although SMBs are more online now than ever before, most don’t have an IT department at all. Servers are hosted and software is installed by a part-time IT administrator. They need enterprise-level solutions that will not only monitor websites, but their entire online service, as well as servers (CPU, processes, HDD free space, etc.). Most cannot afford to spend an exorbitant amount of money on these monitoring services and many do not have a qualified IT professional. Anturis offers the ultimate solution: an affordable, yet extremely comprehensive service that does not require the skills of a dedicated IT person.

Q: Anturis is currently in the Beta period and is available for free. What are the features provided in this period? And how has the response been so far?

A: The Anturis beta launch is aimed at kick-starting our product and service at an increased rate, as well as building a strong community. It is not what is usually referred to as “beta testing” against an artificially selected pool of users with the main purpose of catching bugs. In our case, we are providing full functionality for free, so the Anturis beta features include full-cycle monitoring. These features include:

  • Collect: A comprehensive and convenient cloud-based monitoring approach to data collection. Drills down through every layer of infrastructure. Can easily monitor across distributed platforms, data centers or branch offices around the world.
  • Analyze: Analyzes collected data prior to alerting you to any potential concerns. Correlates related issues from different parts and layers of the IT infrastructure.
  • Alert: Generates meaningful and actionable alerts.
  • Report: Gives you the option to view detailed information about specific problems as well as get historical perspective.
  • Troubleshoot: Provides the tools to make solving IT problems faster and easier.
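As a rough sketch of that collect/analyze/alert cycle (the class, metric name and threshold below are hypothetical, not Anturis’ actual API), analyzing a rolling window before alerting is what keeps alerts “meaningful and actionable” rather than firing on every spike:

```python
import statistics
from collections import deque

class Monitor:
    """Minimal collect -> analyze -> alert loop for a single metric."""

    def __init__(self, name: str, threshold: float, window: int = 5):
        self.name = name
        self.threshold = threshold
        self.samples = deque(maxlen=window)  # Collect: rolling window

    def record(self, value: float) -> None:
        self.samples.append(value)

    def alert(self):
        """Analyze before alerting: trigger on the window average so a
        single spike does not page anyone."""
        if len(self.samples) < self.samples.maxlen:
            return None  # not enough data collected yet
        avg = statistics.mean(self.samples)
        if avg > self.threshold:
            return f"{self.name}: average {avg:.1f} exceeds {self.threshold}"
        return None

cpu = Monitor("cpu_percent", threshold=90)
for v in [95, 20, 30, 25, 22]:  # one brief spike, then normal load
    cpu.record(v)
print(cpu.alert())  # None -- the lone spike is not actionable
```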

As the Anturis beta just recently launched, we are experiencing organic growth and have already received much positive feedback. The overall Anturis beta response to date mainly confirms the value of an easy-to-set-up and easy-to-use solution for the SMB market.

Q: How long will the beta period last?

A: If users sign up today, they will receive the Anturis beta for free for six months. We have several goals to reach with our beta. This includes building a good base for commercial launch, verification of marketing channels and so on. We plan to achieve this in 3-5 months. After that, we will move to the next commercial stage.

Q: How much will the commercial plans cost after the beta period is over?

A: The price will start from a low two-digit figure (per month) and will depend on the number of monitors (measured parameters) needed. Our commercial plan is still in development and we are not ready to provide specific price points at this time. The main goal is to present plans that fit companies with different IT infrastructure sizes and needs. There will also be a free plan once the commercial solution launches.

Q: Anturis sounds like a very promising product, but the competition from open-source technology is pretty intense out there. Nagios, the current industry standard for IT monitoring services, costs far less than enterprise-level software and falls somewhat in the same league as yours. How do you plan to stand out?

A: Nagios is a great solution, and IT professionals really enjoy it. However, to use Nagios you have to be a well-qualified IT expert, both to set it up and to use it. It also requires installation of many parts. That means that even though the price of the solution is low (or free), there are other associated costs. Even if you are an expert, you can’t start monitoring in 5 minutes. You definitely can with Anturis.

Q: What value does Anturis offer in the Cloud arena?

A: It’s now obvious that everything is moving to the cloud. The hosted IT infrastructure market’s CAGR is 25%. So why shouldn’t IT infrastructure management tools follow the trend and move to the cloud as well? Cloud technology brings a whole new generation of IT solutions, which are easier to use and much more affordable for smaller businesses. As an example, you can rent a cloud server with just a few mouse clicks and it will cost you several dozen dollars per month. No upfront cost. No need for a full team of skilled IT professionals. Legacy IT management/monitoring software solutions (whether commercial, enterprise or open-source) look a bit cumbersome in the new cloud context. When you can add a new virtual server in one minute, you wouldn’t want to spend ten minutes configuring monitoring for it.

At Anturis, we intend to fill the gap and build full-featured cloud IT infrastructure monitoring and troubleshooting solution for SMBs: Easy to set up, reliable and affordable.

Q: What plans do you have in store for 2013?

A: Today, we are starting with the Anturis beta, mainly in the US. By the end of the year Anturis will go commercial. We will also target additional markets globally (Europe, Russia) and work towards expanding our user base and building up our customer base.

Categories
New Products News Technology

Liquid Technologies Releases Liquid XML Studio 2013

Liquid Technologies Ltd, a privately owned software vendor in the UK, has officially released the latest version of its well-known and best-selling Liquid XML Studio software, Liquid XML Studio 2013. Liquid XML Studio has long been recognized as an XML toolkit and IDE containing all of the tools essential for designing and developing XML applications, including an XML editor, XML schema editor, WSDL editor, web service tools and more.

The 2013 release brings many new features and enhancements, plus general performance and stability improvements. Liquid XML Studio 2013 extends and improves on the existing functionality by adding new tools and technologies that end users have asked for, continuing to make Liquid XML Studio the best-value XML development environment available and a truly all-in-one XML toolkit.

New for 2013 is code generation and runtime support for MonoTouch for iOS and Mono for Android, a major milestone in the company’s history that allows smartphone app developers to use Liquid XML Studio to build apps for both the Android and Apple operating systems. The capability is delivered through Liquid Studio’s data binding wizard, which provides code generation and Liquid Runtime support for MonoTouch™ for iOS and Mono for Android™ with .NET Framework 4.0, including project files and source C# and VB.NET files. Currently the feature is only found in the developer edition of the software.

Some of the other brand new developments incorporated are:

Real-time background validation – XML files and XML schemas are automatically validated as you construct your document or schema, with errors displayed in a separate pane.

Real-time spell checker – XML documents can be spell-checked in real time while you build your document. The checker supports “camel-case” checking and is fully XML-aware, meaning it can distinguish between markup tags and content.
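To illustrate what “XML-aware” means in practice (the snippet uses Python’s standard library, not Liquid XML Studio’s own engine), a tool can parse the document, reject anything that is not well-formed, and hand only element text, never tag names, to a spell checker:

```python
import xml.etree.ElementTree as ET

def split_markup_and_text(xml_string: str):
    """Parse the document (raises ParseError if it is not well-formed)
    and separate element names from human-readable text content."""
    root = ET.fromstring(xml_string)
    tags = {el.tag for el in root.iter()}
    text = [el.text.strip() for el in root.iter()
            if el.text and el.text.strip()]
    return tags, text

doc = "<note><to>Alice</to><body>Helo world</body></note>"
tags, text = split_markup_and_text(doc)
# Only `text` would be passed to the spell checker, so "Helo" gets
# flagged while tag names like "body" are never treated as words.
print(sorted(tags))  # ['body', 'note', 'to']
print(text)          # ['Alice', 'Helo world']
```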

These come in addition to a host of other advancements and new features, for instance breadcrumb navigation of elements, a WSDL 2.0 Visual Studio plugin, and code generation and Liquid runtimes for Visual Studio 2012, plus much more. Liquid XML Studio is in use by many of the largest corporations worldwide across a number of sectors. In the past, glowing acknowledgements have come from global firms such as HSBC, Microsoft, Nokia, HP, Cisco and Walmart.

XML Studio 2013 was formally released on 1st February 2013. More information about Liquid XML Studio and the new 2013 features is available on the Liquid Technologies website.
