The usage of Linux on Azure has surpassed that of Windows, Microsoft Linux kernel developer Sasha Levin confirmed to ZDNet.
The battle between Windows and Linux has been going on for over a decade. While Windows became the clear dominant OS on desktops, Linux has won the battle on servers.
In 2016, Azure CTO Mark Russinovich revealed that 25% of Azure instances were Linux, a figure that rose to 40% the next year. Then in 2018, Microsoft told ZDNet that around 50% of Azure VMs were Linux.
This shows that Linux hasn’t won the battle overnight. More and more enterprises are choosing Linux over Windows when it comes to servers.
“Every month, Linux goes up,” Scott Guthrie, Executive VP of Microsoft’s cloud and enterprise group, told ZDNet in September last year.
Microsoft users have been actively choosing Linux and open-source software for over 10 years, since Microsoft open-sourced ASP.NET. “We recognized open source is something that every developer can benefit from. It’s not nice, it’s essential. It’s not just code, it’s community,” said Guthrie. “We’re now the largest open-source project supporter in the world.”
Now, almost a dozen Linux distros are available on Azure, and that is without counting Microsoft’s own Azure Sphere.
The Facebook–Cambridge Analytica data scandal of early 2018 saw a breach of 87 million user records. An app scraped millions of people’s data, and Cambridge Analytica gained access to the personal data of Facebook users.
Similarly, Quora, the popular question-and-answer website, and Marriott also reported data breaches, affecting about 100 million users and 500 million hotel guests, respectively.
Numerous data breaches occurred in 2018 and the rate of occurrence is getting higher each year.
The cyberthreat landscape keeps getting more dangerous in today’s highly connected world. Cybercriminals are more organized, smarter, and more potent than ever, and cyberattacks have become a major worldwide industry worth trillions of dollars.
Nevertheless, forewarned is forearmed. To fortify a business, the best solution is to understand the tendencies and trends that malefactors have been following and adopt cybersecurity best practices.
A recent Microsoft-commissioned Frost & Sullivan study revealed that “potential economic loss across Asia Pacific due to cybersecurity breaches can hit a staggering US$1.745 trillion — more than 7% of the region’s total GDP of US$24.33 trillion”.
Focused on the state of cybersecurity in Asia Pacific, the study also reveals some alarming facts about the cybersecurity threat landscape in India.
It states that in India, cybersecurity threats can cost a large organization an average of US$10.3 million and a mid-sized organization an average of US$11,000 annually.
The study further reveals that more than three in five organizations (62%) surveyed in India have either experienced a cybersecurity incident (30%) or are not sure whether they had one, because they have not performed proper forensics or a data breach assessment (32%).
A large number of organizations do not conduct routine cybersecurity assessments or reviews to determine whether they have been victims of breaches.
If IT departments are not routinely checking whether their systems have been infiltrated, they are putting their companies at great risk. Knowing about breaches allows you to assess the weaknesses, fix them, and evaluate the damage.
Key takeaways from Microsoft commissioned Frost & Sullivan study
1. Cybersecurity concerns delay Digital Transformation plans
59% of enterprises have put off their digital transformation efforts for fear of cyber-risks.
In this digital transformation era, securing corporate data and managing risks is the top priority for business decision makers and IT leaders, even as they take advantage of the opportunities presented by today’s mobile-first, cloud-first world.
The report states that cybersecurity breaches cause organizations significant losses: financial loss and damage to customer satisfaction and market reputation. These incidents undermine the ability of Indian businesses to capture the opportunities of the digital economy.
Keshav Dhakad, Group Head & Assistant General Counsel, Corporate, External & Legal Affairs (CELA), Microsoft India, said, “As companies embrace the opportunities presented by cloud and mobile computing to connect with customers and optimize operations, they take on new risks. With traditional IT boundaries disappearing, the adversaries now have many new targets to attack. Companies face the risk of significant financial loss, damage to customer satisfaction and market reputation—as is evident from high-profile breaches this year.”
2. Remote code execution, data exfiltration, multiple security tools and complex environments are the key concerns of organizations witnessing cybersecurity incidents
The study also reveals the key cyberthreats and the gaps in Indian organizations’ cybersecurity strategies. Organizations in India that encounter cybersecurity incidents face the threats with the slowest recovery times, such as remote code execution and data exfiltration.
A large number of cybersecurity tools and a complex environment also add to the turnaround time, so organizations should avoid deploying a large portfolio of cybersecurity solutions in the hope of stronger protection.
The survey revealed: “24% of respondents with more than 26 to 50 cybersecurity solutions could recover from cyberattacks within an hour. In contrast, 32% of respondents with fewer than 10 cybersecurity solutions responded that they can recover from cyberattacks within an hour.”
3. Cybersecurity is an afterthought for most organizations
37% of businesses don’t see cybersecurity as a strategic business enabler; rather, they consider it a “safeguard” against cyberattacks. Only 18% see cybersecurity as a digital transformation enabler.
Only a few organizations consider cybersecurity when initiating a digital transformation project; the rest either think about it after they have started or do not consider it at all. This leads to insecure products going out into the market.
Three kinds of losses that could result from a cybersecurity breach
Direct losses include financial losses associated with the incident, such as loss of productivity, fines, and remediation costs.
Indirect losses include opportunity costs such as loss of customers and reputation.
Induced losses include impacts on the broader ecosystem and economy, such as job losses and decreases in consumer and enterprise spending. As per the study, 64% of organizations suffered job losses due to cybersecurity attacks over the last year.
4. Artificial Intelligence (AI) will act as a key equalizing factor in cybersecurity defense
92% of surveyed Indian organizations are looking to leverage Artificial Intelligence to boost their cybersecurity strategy.
The study also reveals that 22% of Indian organizations have already seen the benefits of using AI for faster and more accurate threat detection.
AI’s ability to detect and act on attack vectors is based on data insights. Organizations using AI gain predictive abilities that help them rapidly analyze and respond to unprecedented quantities of data and match the frequency, scale, and sophistication of cyberattacks. They can fix or strengthen their security posture before problems emerge, identify cyberattacks, remove persistent threats, and fix bugs.
5 best practices to improve defense against cybersecurity threats
It is always better to find a problem early and address it quickly.
“The ever-changing threat environment is challenging, but there are ways to be more effective using the right blend of modern technology, strategy, and expertise. Microsoft is empowering businesses in India to take advantage of digital transformation by enabling them to embrace the technology that’s available to them, through its secure platform of products and services, combined with unique intelligence and broad industry partnerships.” – Mr. Dhakad
The Frost & Sullivan report recommends a set of key practices for organizations to improve their defense against cyber threats.
Consider cybersecurity as a digital transformation enabler: Establish a connection between your digital transformation efforts and your cybersecurity practices. Cybersecurity is necessary to keep the company safe through its digital transformation journey, while digital transformation gives cybersecurity practices an opportunity to embrace new methods of addressing digital risks.
Over 90% of cyber incidents can be averted by maintaining the most basic best practices.
Leverage well-integrated best-of-suite tools instead of deploying as many as possible: Prioritize your best suite of tools, reduce their number, and simplify your security operations so that your employees can do their best with the tools available.
Assess and review compliance continuously: Maintain a continuous state of compliance. Conduct assessments and reviews regularly to test for potential gaps that may occur as the organization transforms, and address those gaps. Keep tabs on compliance with industry regulations and on the organization’s progress against security best practices.
Leverage AI and automation to increase capabilities and capacity: Organizations should look to automation and AI to improve the capabilities and capacity of their security operations. These will help them to:
Raise detections that would otherwise be missed.
Interpret the various data signals with the recommended actions.
Free up cybersecurity talents to focus on higher-level activities.
So, it’s time for your IT checkup: follow the trends the way cybercriminals do, conduct regular cybersecurity assessments, focus on cyber hygiene, and ensure strong security fundamentals. Either take cybersecurity action yourself or engage somebody else to do it for you.
Do remember, it is you who leaves your devices or systems open to vulnerabilities. These threats are manageable, but it is up to you to do your part.
Today, the world is witnessing the fourth industrial revolution. Everything is evolving due to the rapid rise of new technologies and the fusion of the physical and digital spheres. Technology is the most important factor driving this revolution. Microsoft’s technologies are no exception: almost every day, Microsoft updates and improves them to make the world a better connected, more developed, and more secure place to live and work in.
In July 2018, Microsoft announced the End of Support timeline for SQL Server 2008 and 2008 R2 and for Windows Server 2008 and 2008 R2.
The 2008 release cycle saw a shift from 32-bit to 64-bit computing, advanced analytics and budding server virtualization technology. The new decade marks the era of hybrid cloud, artificial intelligence and other technological innovations.
What does SQL Server and Windows Server 2008 end of support mean for my business?
Under its lifecycle policy, Microsoft offers 10 years of support for its server products: 5 years of Mainstream Support and 5 years of Extended Support.
Support for SQL Server 2008/2008 R2 ends on July 9, 2019, and for Windows Server 2008/2008 R2 on January 14, 2020.
End of support means Microsoft will not ship security updates or provide any other kind of support for the 2008 Windows and SQL servers after those deadlines. The lack of security updates increases the risk to your infrastructure and exposes it to cybercriminals. Companies can also face compliance and standards issues; especially with GDPR in force, you should not put your business at risk and incur penalties.
The gravity of the situation lies in the increased risk of cyberattacks and other vulnerabilities for businesses that are not running the latest server versions.
A report by Symantec states:
A 92% increase in new malware downloader variants.
A 46% increase in new ransomware variants.
A 600% increase in attacks against IoT devices.
The risk of not upgrading your software is clear.
How to prepare for SQL Server and Windows end of support?
Microsoft ensures that its customers are fully supported during this transition. The company introduced two new options to help organizations move into the new decade.
New options for SQL Server 2008 and Windows Server 2008 End of Life
With the deadlines approaching fast, customers have very little time left to act.
Microsoft suggests that users upgrade to the latest versions of both products. This lets them leverage Software Assurance benefits for reduced security risks and continued security updates. For customers who will not be able to make this transition before the deadline, Microsoft has introduced new options:
Extended Security Updates by Migrating to Azure
For organizations still running their infrastructure on-premises, the end of life is a golden opportunity to make the shift to the cloud. However, that is easier said than done. Hence, Microsoft is offering Extended Security Updates for free in Azure for SQL Server and Windows Server, covering both the 2008 and 2008 R2 versions of each. Organizations can:
Rehost SQL Server 2008/2008 R2 in Azure SQL Database Managed Instance with little to no code changes, getting a version-free, fully managed platform.
Move to Azure Virtual Machines and upgrade to a newer version when ready. Here too, customers get three years of extended support at no extra charge.
Customers can use their existing licenses and save nearly 55 percent with the Azure Hybrid Benefit. For Windows Server, they can save nearly 80% on Azure VMs by combining Reserved Instances with the Hybrid Benefit.
Customers do not need Software Assurance when moving to Azure. However, they might require it if they wish to leverage the Azure Hybrid Benefit.
The most straightforward solution is simply to upgrade to SQL Server 2017 and Windows Server 2016.
SQL Server 2017 is built for greater performance, security, availability and innovation with intelligent cloud analytics.
Customers running Windows Server or SQL Server under licenses with active Software Assurance under an Enterprise Subscription Agreement (EAS), Enterprise Agreement (EA), or Server and Cloud Enrollment (SCE) can also purchase Extended Security Updates for three years past the end-of-support deadline. They can buy security updates only for the servers they need to cover.
Note that only the Datacenter, Enterprise, and Standard editions of SQL Server and Windows Server 2008 and 2008 R2 are eligible for Extended Security Updates, and customers must be on the latest service pack for each product to receive them.
When will the Extended Security Updates option be available?
Those who opt for Azure migration can begin migrating workloads to Azure VMs immediately and apply regular security updates until the end-of-life deadline. Once the deadline passes, Extended Security Updates become available automatically for continued coverage.
For those who stay on-premises or in a hosted environment, Extended Security Updates will be available for purchase as the end-of-life deadline approaches; Microsoft will announce a specific date. Extended support will be delivered immediately after the deadline ends.
Extended Security Updates for SQL Server 2008/2008 R2 and Windows Server 2008/2008 R2 will include security updates and other bulletins rated critical, available for a maximum of three years past the deadline.
This offer will not include:
Any technical support. Customers will have to buy Microsoft support plans if they need assistance with 2008/2008 R2 questions.
New features, design changes, or non-security hotfixes.
Retroactive delivery of any updates that were declined by the engineering teams in the past.
How much do the Extended Security Updates cost?
In Azure: Customers running Windows Server or SQL Server 2008/2008 R2 in Azure will get Extended Security Updates at no charge beyond the standard VM rates.
Customers moving to Azure SQL Database Managed Instance (PaaS) will not need Extended Security Updates, as it is a fully managed solution that Microsoft keeps patched and updated.
Hosted: Customers will have to purchase Extended Security Updates at 75% of the full on-premises license cost per year to use them in a hosted environment.
On-premises: Customers with active subscription licenses or Software Assurance can purchase Extended Security Updates at 75% of the EA license cost annually. They can also reduce cost by paying only for the servers they need to cover while gradually upgrading the environment.
Our take on the new options
Whether you choose to stay on-premises or in a hosted environment, or treat this as a chance to move to the cloud, the only wrong choice is not making any choice at all.
The end of support for these servers is a new opportunity to innovate and explore options in the cloud or on-premises.
As per a recent Gartner forecast, global IT spending will reach $3.7 trillion in 2018, with the main drivers of growth being projects in digital business, blockchain, the Internet of Things (IoT), and the progression from big data to algorithms to machine learning to artificial intelligence (AI).
Today, software has become critical for almost every organization. Businesses rely heavily on software for nearly every function. The alignment of IT to business, increased competition, and an eternally shifting digital landscape are the main reasons for this dependency.
Given the increasing importance of software for every digital business, IT enterprises are doing more application development and looking to DevOps best practices to deliver software faster, with a continuous delivery workflow, improved reliability, and minimal errors.
What is DevOps?
DevOps is about the quick and better delivery of software and IT services through the adoption of agile and lean practices.
DevOps teams use automation tools to leverage an increasingly programmable and dynamic infrastructure across the life cycle, and to optimize and automate the entire software production system while maintaining high quality and security.
Why is DevOps important for digital transformation?
One of the key findings of the 2017 State of DevOps Report is that “DevOps practices are the foundation of every company’s digital transformation, and digital transformation impacts a company’s performance”.
In the age of digital transformation, businesses require more revolutionary methods of getting work done: tools that manage the business, cross-functional team performance, and user data, all from one platform.
DevOps is a methodology that brings development and operations together, facilitating improved collaboration and information sharing between these two cornerstones of an organization. This collaboration helps in the rapid production of quality products and services, and in improving them. DevOps also ensures the quality of application updates and infrastructure changes.
With DevOps, organizations gain the ability to release new features and fix bugs at a faster pace, and can thus respond to and serve their customers better and compete more effectively in the market.
Organizations are using Cloud DevOps for flexibility, ease of use and speed.
DevOps helps an IT or digital organization deliver more value at higher quality, provide its users with an exceptional experience, recover faster from production and infrastructure outages, and prevent failures.
Organizations use software for many purposes, like providing a better customer experience or engaging with customers. Sometimes DevOps adoption means delivering software that customers touch, like the customize-your-shoes feature on Nike.com or the direct-claim facility in the Travelers mobile app. Sometimes it means delivering software the customer never sees, or software that gives the company’s employees and partners a better experience.
The 2017 State of DevOps Report reveals strong evidence that DevOps best practices lead to higher IT performance. Companies that have adopted DevOps tools and principles report:
46 times more frequent software deployments than their competitors.
96 times faster recovery from failures.
440 times faster lead time for changes.
5 times lower change failure rate.
Operational efficiency and improved levels of customer satisfaction.
Thus, with agile DevOps, all digital organizations, for-profit and non-profit alike, can achieve their goals, no matter what their mission is.
What affects an organization’s ability to develop and deliver software?
Leadership’s effect on technical practices and process-improvement changes.
The application’s architecture, and the structure of the teams that build it.
Automation and management practices.
Thus, an organization needs to simultaneously practice a set of principles essential to DevOps success and business outcomes.
5 key DevOps principles and practices for successful business outcomes
1. Cultivate and develop high performing teams
By 2020, 50% of CIOs who have not transformed their teams’ capabilities will be displaced from their organizations’ digital leadership teams – Gartner
So, for a successful DevOps transformation, companies need high-performing leaders with transformational characteristics on board.
What does it mean to be a transformational leader?
A transformational leader is an engaged leader who enables the practices that correlate with high performance.
Such leaders support team experimentation and innovation for the faster and better creation and implementation of products. They work across organizational silos for strategic alignment and support effective communication and collaboration between team members in pursuit of organizational goals.
They implement technologies and processes that enable developer productivity, reduce code deployment lead times, and support more reliable infrastructure.
These characteristics of a transformational leader are correlated with high IT performance.
Thus, digital organizations should aspire to build high performing teams.
2. Achieve faster throughput without ignoring stability: build quality into the process
The presence of leaders with transformational characteristics is not enough to achieve high DevOps outcomes and high IT performance.
IT performance is measured along two dimensions: throughput of code (deployment frequency and change lead time) and stability of systems (mean time to recover and change failure rate).
High-performing DevOps teams ideally have a high deployment frequency, fast change lead times, a reduced mean time to recover (MTTR), and a low change failure rate.
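These four measures can be computed from simple deployment records. Below is a minimal sketch in Python; the record layout and field names are hypothetical stand-ins, not taken from any standard tooling:

```python
# Hypothetical deployment records over a one-week window:
# commit-to-production lead time, whether the change failed in
# production, and how long recovery took when it did.
deployments = [
    {"lead_time_hours": 4,  "failed": False, "recovery_hours": 0},
    {"lead_time_hours": 12, "failed": True,  "recovery_hours": 2},
    {"lead_time_hours": 6,  "failed": False, "recovery_hours": 0},
    {"lead_time_hours": 8,  "failed": True,  "recovery_hours": 1},
]
days_observed = 7

# Throughput: deployment frequency and average change lead time.
deploy_frequency = len(deployments) / days_observed
avg_lead_time = sum(d["lead_time_hours"] for d in deployments) / len(deployments)

# Stability: change failure rate and mean time to recover (MTTR).
failures = [d for d in deployments if d["failed"]]
change_failure_rate = len(failures) / len(deployments)
mttr = (sum(d["recovery_hours"] for d in failures) / len(failures)) if failures else 0.0

print(f"deploys/day={deploy_frequency:.2f}, lead={avg_lead_time:.1f}h, "
      f"failure rate={change_failure_rate:.0%}, MTTR={mttr:.1f}h")
```

High performers push the first two numbers up and the last two down; tracking all four together keeps a team from trading stability for speed.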
They invest in building quality into the process, which gives them the advantage of high customer satisfaction and many more chances to deliver new value.
The outcome is faster time to market, better customer experience, and higher responsiveness to market changes.
3. Automate your DevOps process for faster feedback cycle and innovation
Automate your manual tasks, particularly configuration management, testing, deployment, and change approval. Automation frees your technical staff for more innovation and quickens the feedback cycle at the same time, adding more value to the organization.
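As a sketch of what automating those gates can look like, the toy pipeline below deploys only when a configuration check and the test suite both pass. Every name here (check_config, run_tests, the config layout) is a hypothetical stand-in, not a real CI/CD API:

```python
def check_config(config: dict) -> bool:
    # Configuration-management gate: every service must pin a version.
    return all("version" in svc for svc in config["services"])

def run_tests(results: list) -> bool:
    # Testing gate: the whole suite must pass.
    return all(results)

def pipeline(config: dict, test_results: list) -> str:
    # Automated change approval: deploy only when every gate passes,
    # replacing a slow manual sign-off with a fast feedback cycle.
    if not check_config(config):
        return "rejected: config check failed"
    if not run_tests(test_results):
        return "rejected: tests failed"
    return "deployed"

config = {"services": [{"name": "api", "version": "1.2"},
                       {"name": "web", "version": "3.4"}]}
print(pipeline(config, [True, True]))   # deployed
print(pipeline(config, [True, False]))  # rejected: tests failed
```

The point of the sketch is the shape, not the checks themselves: each manual step becomes a function that either passes or rejects, so failures surface in seconds rather than in a review queue.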
4. Shift to loosely coupled services for higher throughput, quality and stability
An organization’s DevOps success also depends on suitable architecture and good technical practices.
Another DevOps practice essential to higher IT performance and continuous delivery is allowing practitioners to choose their tools, based on the way they work and the tasks to be performed.
Shift to services that can be developed and released independently. Build loosely coupled architectures and teams, free of dependencies. Design your delivery teams and architecture so that teams can test, deploy, and change their systems without depending on other teams for additional work, resources, or approvals, and without back-and-forth communication.
5. Practice lean product management and lower deployment pain
Work in small batches and set your development teams free. Entrust them with the authority to create and change specifications without requiring approval (as part of the development process, of course), and actively seek users’ feedback as an input.
This will improve and speed up your software delivery pipeline and help you continuously and quickly deliver what your customers want.
To understand the success or failure of DevOps in an organization, one needs to acknowledge how the organization gets its work done, how it is structured, and how its processes and teams work.
Blockchain is a technology that is likely to have a significant impact on the world in the next few decades. The term ‘blockchain’ is most often heard alongside bitcoin and other cryptocurrencies, because blockchain is the underlying technology of all bitcoin transactions. However, blockchain is more than just crypto-technology.
According to Marc Andreessen, co-founder of a Silicon Valley venture capital firm, blockchain is the most important invention since the internet. Quintessence magazine wrote that blockchain should be considered an invention as significant as the steam engine, one that can transform the world of finance and beyond.
It has the potential to transform the way we approach big data today, with industry-leading security and data quality. Blockchain has been working flawlessly for the last few years and is used in both financial and non-financial applications.
What is blockchain technology?
A blockchain is simply a ‘chain’ of ‘blocks’: digital information is divided into ‘blocks’ and ‘chained’ together.
At a deeper level, it is a secure, digitized, and distributed public ledger of executed transactions or digital information, shared by the participants in a system. The blockchain makes transactions immutable and establishes trust between the parties exchanging information or money.
The database in a blockchain is stored in a distributed manner, which means there is no centralized copy of the information for a hacker to corrupt or steal.
Information executed and shared in a blockchain can’t be deleted. The blockchain keeps a verified record of all transactions, with the consensus of all participants in the system.
By analogy, it is easier to steal a diamond kept in an isolated place than to steal one from a jewelry showroom watched by hundreds of people.
Who invented blockchain?
The concept of blockchain was introduced by Satoshi Nakamoto, the same person (or group) who invented bitcoin in 2008. Nakamoto never revealed their identity and faded from the community in 2011.
Blockchain was first implemented in 2009 as the core component of the bitcoin cryptocurrency, serving as the public ledger for all transactions on the network.
Strengthened by blockchain, bitcoin became the first digital currency to solve the double-spending problem without requiring a trusted authority, and it has inspired many other applications.
How does blockchain solve the double-spend problem?
When a Word document, PowerPoint, text message, or email is sent to anyone, a copy is sent rather than the original; the sender still keeps the original file. This technology is democratized and useful, but for financial transactions it doesn’t work.
For example, if you’re sending $100 to someone, you can’t send a copy of it. To solve this problem, everyone relies on trusted authorities like governments, banks, and payment companies. In non-financial cases, it is an email service provider that processes all the emails, or a social media platform that tells us a post or message has been shared with the intended user.
This process makes all transactions centralized, creating a single point of attack, even when the authorities try their best to secure everything.
Blockchain technology has the potential to eliminate this issue by enabling a distributed consensus on all online transactions, greatly reducing the risk of data breaches or compromised privacy.
How does blockchain work?
In a blockchain, all the transactions in a network get chained together in the form of blocks. Every block contains a reference to the previous block and a hash of the data in the block.
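That chaining can be sketched in a few lines: each block stores the previous block's hash, so altering any historical block invalidates every block after it. This is a simplified illustration, not bitcoin's actual block format:

```python
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    # A block's identity covers both its data and the previous block's hash.
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(transactions: list) -> list:
    chain, prev = [], "0" * 64  # the genesis block points at an all-zero hash
    for data in transactions:
        h = block_hash(prev, data)
        chain.append({"prev_hash": prev, "data": data, "hash": h})
        prev = h
    return chain

def is_valid(chain: list) -> bool:
    # Recompute every link; a tampered block breaks the chain from that point on.
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev or block_hash(prev, block["data"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain(["D pays A 15 BTC", "A pays C 5 BTC"])
print(is_valid(chain))                  # True
chain[0]["data"] = "D pays A 150 BTC"   # tamper with history
print(is_valid(chain))                  # False
```

Because each hash covers the previous block's hash, verifying the links from the genesis block forward vouches for the entire history.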
Bitcoin is the best-known application of blockchain technology, so we use it as a reference to explain how blockchain works. (To learn what bitcoin is, and whether you should invest in it, see our article about bitcoin.)
Blockchain is used to keep track of all bitcoin transactions in the form of a ledger file. This ledger file is not centralized; rather, it is distributed across a network of private computers. Every computer in the network has a copy of the ledger file and knows about all the transactions in the network.
For example, suppose there are five users in a blockchain network, named A, B, C, D, and E, each having some number of bitcoins in their wallet.
If user D wants to send 15 bitcoins to user A, the transaction is represented online as a block. This block is broadcast to every user in the network. If a majority of users in the network approve that the history and signature of the block are valid, the new transaction is accepted into the ledger and a new block is added to the blockchain. The bitcoins are then transferred from user D to user A.
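The majority-approval step in this example can be sketched as a simple vote among the five nodes' ledger copies. This is a toy model that assumes honest, one-node-one-vote participants; real networks like bitcoin reach consensus through proof-of-work instead:

```python
def approve(node_ledger: dict, sender: str, amount: int) -> bool:
    # Each node checks the proposed transaction against its own ledger copy.
    return node_ledger.get(sender, 0) >= amount

def broadcast(nodes: list, sender: str, receiver: str, amount: int) -> bool:
    # Accept the transaction only if a majority of nodes approve it.
    votes = sum(approve(ledger, sender, amount) for ledger in nodes)
    if votes <= len(nodes) // 2:
        return False
    for ledger in nodes:  # every copy of the distributed ledger is updated
        ledger[sender] -= amount
        ledger[receiver] = ledger.get(receiver, 0) + amount
    return True

# Five nodes (A..E) each hold an identical copy of the ledger.
balances = {"A": 10, "B": 20, "C": 5, "D": 30, "E": 8}
nodes = [dict(balances) for _ in range(5)]

print(broadcast(nodes, "D", "A", 15))  # True: D holds 30 >= 15
print(nodes[0]["A"], nodes[0]["D"])    # 25 15
print(broadcast(nodes, "C", "B", 50))  # False: C cannot cover 50
```

A forged transaction (a sender spending coins they do not have) fails the vote on every honest copy of the ledger, which is how distributed consensus blocks double spending.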
Blockchain technology is attracting enormous interest from numerous companies and startups. Leading financial organizations like Visa, Mastercard, and NASDAQ, and several banks, are investing significantly in blockchain to explore its applications for their current business models.
The telecom system integration market will reach $25 billion by 2022, up from $16.56 billion in 2017, growing at a CAGR of 8.7% during the forecast period, as per a report by market research firm ASDReports.
The adoption of telecom system integration services by communication service providers has been driven mainly by the demand to handle cloud and migration activities.
In the telecom industry, network infrastructure plays a key role in central operations. Advancements in networking technologies drove their adoption among telcos.
Besides these advancements, varied customer expectations and fierce market competition also encouraged the telecom industry to embrace telecom system integration services.
Telecom system integrators help telcos efficiently integrate network monitoring and network security solutions with existing network infrastructure. System integrators reduce the risks related to integration and provide stability to the network infrastructure.
System integration services will see significant growth in cloud-based solutions during the forecast period, with rising demand for data storage and mobility as the key growth factors.
Evolution of 5G networks, migration from wired to wireless network, and transition from IPv4 to IPv6 technologies will also significantly contribute to growth of network management market.
By region, North America is expected to be the largest contributor to the global system integration market during the forecast period, owing to the presence of leading telecom and communication service providers in the region, including AT&T, T-Mobile, Verizon, and Sprint Corporation.
IBM, Ericsson, Nokia Networks, Tech Mahindra, Huawei, Wipro, Infosys, DXC Technology, Cognizant, HCL, Syntel, and Stixis Technologies are the major telecom system integration service providers worldwide.
The automation-as-a-service (AaaS) market is expected to reach $7.4 billion by 2023, growing at a CAGR of 27% from 2017 to 2023, according to a new report from KBV Research.
AaaS helps organizations automate their business processes, shifting from slow, manual workflows to fast, reliable automated ones. Increasing demand for cloud services and automation is driving the AaaS market.
Based on component, the report segments the market into solutions and services. The solutions segment dominated the global AaaS market in 2016 and is expected to continue doing so through 2023, while the services segment is expected to grow at a CAGR of 31.8% during the forecast period (2017-2023).
In 2016, North America held the largest market share in global operations and IT AaaS, and it will continue to dominate through 2023, growing at a CAGR of 24.5%.
Europe will grow at a CAGR of 25.5% during the forecast period in the finance AaaS market, while the APAC region is expected to see a CAGR of 30.5% in the human resources AaaS market.
By type, rule-based automation dominated the market in 2016 and is expected to remain dominant through 2023, while knowledge-based automation will grow at a CAGR of 31.2% during the forecast period.
Banking, financial services, and insurance (BFSI) held the largest market share in 2016 and will continue to do so through 2023, growing at a CAGR of 24.9%.
The healthcare AaaS market is expected to reach $926.9 million by 2023, while the retail market will grow at a CAGR of 27.9% from 2017 to 2023.
KBV Research also profiled leading companies, including IBM, Microsoft, HPE, Pegasystems, Blue Prism, and Automation Anywhere, examining their key strategic developments such as mergers and acquisitions, product launches, and partnerships.
The NexGen 2017 Conference was designed specifically for cloud solution providers, managed service providers, and other IT solution providers building new business models around next-generation technologies: the innovators who leverage cloud and cloud-based technologies to drive new revenue and future profits.
The event kicked off with a keynote by Tom DelVecchio, Founder, Enterprise Technology Research, who discussed the forces driving the container and microservices market. Microservices lower capex and shorten release cycles, helping to improve productivity and scalability.
Dorothy Copeland, Vice President, Global Business Partners, North America, IBM, talked about innovation in data, IoT, and blockchain. She identified five eras of innovation: centralized computing, decentralized computing, data, IT, and intelligent services (AI, IoT, blockchain). She also highlighted the countless opportunities in each era, particularly in cloud and artificial intelligence, the Internet of Things, and blockchain.
Later, Noah Johnson, Account Executive, Lenovo, discussed Lenovo’s journey to the software-defined data center (SDDC) and the significance of software-defined infrastructure (SDI) in simplifying organizations’ cloud adoption.
Asokan Ashok, CEO of UnfoldLabs, discussed the world of artificial intelligence and the major trends to watch in 2018.
He outlined the top eight trends in AI: Trend 1: Large companies like Amazon, Google, Facebook, and IBM are set to lead the way in AI.
Trend 2: The market will see the consolidation of Algorithms & Technology.
Trend 3: AI companies will go after crowdsourcing large volumes of data.
Trend 4: There would be increased M&As (Mergers and Acquisitions).
Trend 5: Companies will open source their AI tools and algorithms to gain larger market share.
Trend 6: There would be more interactions between humans and machines.
Trend 7: AI will start having an impact on all major industry verticals.
Trend 8: The rise of AI will also raise privacy, security, ethical, and moral issues.
There were also sessions on cyber and endpoint security by Shannon Lucas, Senior Systems Engineer, FireEye; Laurie Potratz, VP, Global Channel and Alliances, LookingGlass Cyber Solutions; Sarah Morgan, Channel Account Manager, Webroot; and many others. Managed services and security were identified as the key services topping the list of organizational must-haves for 2018.
IoT was also a major point of discussion at the event, with Stephen DiFranco, Founder of IoT Advisory Group, talking about a partner’s IoT journey in his executive session. He discussed upcoming IoT trends and how the coming years will see a huge rise in the use of IoT devices by both individuals and organizations.
IoT services revenue will double by 2021, with retail and healthcare representing services-rich industries for a partner ecosystem, according to the IoT Advisory Group.
The session led by Bradley Brodkin, President & CEO of HighVail Systems, Inc., highlighted the role of containers in digital transformation. He said containers have opened a world of opportunities for DevOps and brought a multitude of business opportunities, including empowering DevOps with modern application tools, transforming the data center, enabling free movement of applications between public and hybrid clouds, and automating processes.
Solution providers are looking for new skills and processes to deliver multi-cloud environments, which requires introducing changes within their own organizations.
The event is hosted annually by The Channel Company, and this year it was attended by vendors, solution providers, and distributors, who got the chance to interact and learn strategies from trusted partners for building next-generation business models based on advanced technologies.
The NexGen 2018 Conference & Technology Expo will be held November 27-29, 2018.
AT&T, the leading telco, has collaborated with Indian multinational IT provider Tech Mahindra to build an open source AI project named Acumos, hosted by the nonprofit Linux Foundation.
The companies intend the platform to make artificial intelligence accessible to everyone by making AI app development easier.
“With the Acumos Project, AT&T and Tech Mahindra are leading the way in bringing AI and machine learning tools to the developer community,” said Jim Zemlin, executive director of The Linux Foundation. “Making it easy for developers to get involved and to really steer the ecosystem around the project in the direction they want it to go is the key to making the Acumos platform successful.”
While the Linux Foundation will host and sustain the Acumos project, AT&T and Tech Mahindra will contribute the code. Acumos, an extensible framework for machine learning solutions, will offer capabilities to compose, edit, integrate, train, and deploy AI microservices.
“Our goal with open sourcing the Acumos platform is to make building and deploying AI applications as easy as creating a website,” said Mazin Gilbert, vice president of Advanced Technology at AT&T Labs. “We’re collaborating with Tech Mahindra to establish an industry standard for AI in the networking space. We invite others to join us to create a global harmonization in AI and set the stage for all future AI network applications and services.”
Most AI development tools are complex and designed for data scientists, but AI is no longer a sci-fi concept; it is already part of our data-driven world. Acumos aims to make artificial intelligence user-centric, with smart automation and predictive analytics that could make drones, connected cars and appliances, and robots an everyday reality.
“Our investment in AI solutions over the past few years is helping us find tremendous opportunity to make it simpler for higher adoption,” said Raman Abrol, SVP & Strategic Business Unit Head at Tech Mahindra. “In collaboration with AT&T, we will help enable enterprises apply AI to reimagine business models, unlock the potential of data and drive business outcomes. Our ultimate goal with the Acumos Project is to accelerate and industrialize the deployment of AI at enterprises and get developers and businesses to collaborate effectively in order to improve how we all live, work and play.”
Acumos will be enhanced further in the future; it is expected to be able to communicate with home appliances, help self-driving cars take themselves in for repairs, and improve life at home, school, and work.
They covered the company’s key product areas, from iPhones, MacBooks, and iOS to the implementation of technologies like machine learning to improve user experience.
Apple, like many other big IT players, is moving to bring AI and machine learning capabilities to users’ systems and mobile devices. With this, iPhone, iPad, and Apple Watch users will soon see faster task execution and improved device usability.
To attain this, the company unveiled Core ML, its new machine learning framework API for developers.
Its core function is to speed up AI execution tasks, which can range from simple text analysis to voice and face recognition. Apple says the technology will make image recognition on the iPhone six times faster than on Google’s Pixel. With AI, voice recognition systems like Siri will better understand user requests and respond with more natural-sounding speech.
Core ML will support several machine learning tools with a primary focus on privacy: the data used to improve the user experience is processed and kept on the device only.
Artificial intelligence and machine learning are the technologies in focus these days at many big firms, including Google, Microsoft, and Facebook. Google and Facebook introduced new AI server designs in March 2017 with the objective of enabling faster server responses.