IT Portal from UK


How capacity planning can reduce IT costs by 30-50 per cent

Fri, 07/05/2019 - 04:30

Virtual workload balancing is often overlooked until well past the eleventh hour in many enterprises. This seems counterintuitive, given how dependent enterprises have become on IT for their success. Capacity planning, or workload forecasting, is all about optimisation. It is the process of determining how capable and prepared the IT infrastructure is to meet future workload demands while managing resources efficiently. This process is typically measured within the context of applications that run in rapidly changing, multi-layered, virtualised and often cloud-based environments. Capacity planning is critical for reducing cost and increasing business productivity. Good capacity management also saves time and helps to protect against unforeseen challenges caused by workload changes, removing the urgent need to throw expensive resources at the resulting performance issues and workload bottlenecks.

Mandatory discipline

Today, capacity planning is a mandatory discipline for enterprises. The problem is that many companies lack the right tools, the right staff expertise, or both. Fortunately, the technology now exists to automate the majority of capacity planning tasks, doing the work more efficiently than a traditional expert could. It is now possible to continuously apply smart predictive analytics to optimise applications that run across countless virtualised systems. Automated predictive analytics, utilising artificial intelligence for IT operations (AIOps), can help an enterprise gain a competitive advantage by optimally maintaining the crucial balance between cost and performance. Through the deployment of AIOps, organisations can effectively optimise their application availability and workloads. AIOps capabilities predict capacity needs, proactively balance infrastructure utilisation, and automate anomaly detection, response and resolution.

A company can then efficiently meet service levels and availability requirements, without introducing undue risk.

Scenario-based or automated? Old-school capacity planners are a rare breed now

Some enterprises choose to rely on staff expertise and scenario-based capacity planning, including the exploration of worst-case events, as opposed to continuous automated predictive analytics that solve performance models across the entire data centre. Automated predictive analytics produce regular reports that show which systems are likely to violate future service levels, as well as when (and why) this will occur. Conversely, scenario-based capacity planning mostly follows one-at-a-time modelling. A scenario-based approach works well for fragmented “what-if” analysis, but less well for continuous overall analysis. It will answer detailed questions such as ‘Which of the strongly cost-controlled infrastructure configurations will still meet service levels?’ Or: ‘How will the entirety of applications respond after adding a new application to a production system?’ Whilst the scenario-based approach allows for a wide array of possible situations, including experimentation with some extreme challenges, automated predictive analytics are what capacity planners will find most useful and least time-consuming in their day-to-day monitoring of applications and systems. The automated process does not require the prohibitive amounts of up-front planning time or staff expertise that the old-school capacity planning model does. By relying on automated predictive analytics, enterprises can be informed of future problems well in advance of their occurrence, giving them sufficient insight to avoid issues entirely.
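
As a rough illustration of the kind of forecast such analytics automate, the sketch below fits a simple linear trend to historical CPU utilisation samples and estimates when a hypothetical 80 per cent service-level threshold would be breached. The threshold, data and single-metric model are assumptions for illustration only; real AIOps tools use far richer workload models.

```python
# Minimal sketch: fit a linear trend to daily CPU utilisation samples and
# estimate when a hypothetical 80% service-level threshold would be breached.
# Illustrative only; production capacity-planning tools use richer models.
import numpy as np

def days_until_breach(samples, threshold=80.0):
    """samples: daily average CPU utilisation (%), oldest first."""
    days = np.arange(len(samples))
    slope, intercept = np.polyfit(days, samples, 1)   # linear trend
    if slope <= 0:
        return None                                   # no upward trend, no breach forecast
    breach_day = (threshold - intercept) / slope      # day at which the trend crosses the threshold
    return max(breach_day - days[-1], 0.0)

history = [52, 55, 54, 58, 61, 60, 64, 66, 69, 71]    # last ten days of utilisation (made up)
eta = days_until_breach(history)
print("No breach forecast" if eta is None else f"Threshold breach expected in ~{eta:.0f} days")
```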

Application performance and utilisation are optimised by continuously re-balancing the underlying infrastructure, including VMs, network paths and storage load distribution. State-of-the-art custom AIOps analytics use machine learning to memorise VM workload patterns, then recommend optimal placement across VMware ESX clusters to proactively avoid memory or CPU contention. The best analytics tools on the market also use machine learning to monitor utilisation of storage ports and recommend how to rebalance them across the storage array. Enterprises deploying such tools can reduce their infrastructure costs by 30-50 per cent by avoiding unnecessary infrastructure spending.

Looking at the latest market developments, good workload forecasting and AIOps will always be part of an advanced infrastructure performance management (IPM) solution. Infrastructure performance management utilising AIOps capabilities is key when it comes to activities such as system measurement, application monitoring and general optimisation. An efficient IT infrastructure relies on the highest levels of transparency to proactively control or avoid issues.

Infrastructure performance management (IPM) - common beliefs

IT environments are invariably managed by humans, so some common beliefs persist: ‘I don’t need any IPM unless I have a real issue; hardware is available at reasonable prices and hence replaces constant IPM solutions – one can always throw a new device at an issue’, ‘IPM and forecasting once a year will do’, ‘IPM and capacity management take up too much time’, and ‘Analysis of workloads and general interpretation is too complex and mostly obsolete’.

Why do these attitudes persist when there are so many reported cases of outages in the press, as well as innovative solutions in existence to solve these issues? Many enterprises, especially throughout Europe, are slow to realise that their IT infrastructure lacks transparency. Appropriate measuring tools, capable of mitigating performance issues and delivering valuable insights on future deployments in support of growing business requirements, will make an enormous difference in assuring business continuity. Just because a major disaster has not hit yet, or because sudden issues can be solved with some quick firefighting that mostly relies on the expertise of senior staff, doesn’t mean this lucky streak will go on forever.

The current mismatch between proven IPM solutions utilising AIOps capabilities that deliver proper workload forecasting, and the stubbornness of enterprises unwilling to turn their IT infrastructure into a model of transparency, can only be solved by giving IT staff the necessary insight into the end-to-end IT stack. Enterprises that neglect IPM risk learning the value of true transparency the hard way – in the event of a disaster, replacing or outsourcing those responsible is always an option.

Louise Dilley, Regional Services Director EMEA, Virtual Instruments

“Making connections” – How IoT-based technology transforms local government

Fri, 07/05/2019 - 04:00

The digital revolution has been transforming how assets are managed and services are delivered for decades, and the rate of change is constantly increasing. This transformation represents an exciting opportunity for asset and service managers to take advantage of new technologies to increase their efficiency and improve the quality of their service delivery. And nowhere is this opportunity more exciting, or the need for change more urgent, than across local government.

Scoping the challenge

Resources are tight and funding is limited. At the same time, residents are becoming more demanding. Council tax payers have a baseline expectation that core services will be met by councils within annual budgets. They also expect authorities to address economic, social and environmental issues in the communities they serve.

By the same token, local authorities are acutely aware of the needs of residents, the cost of delivering service and support, and the limited resources at their disposal to do this.

Unfortunately, funding issues can lead to protracted debates both at a public level and internally within councils, and the scarcity of resources can lead to a focus on competition rather than collaboration, with departments fighting for resources. This can in turn make it difficult to bring people together to find solutions to challenges and to achieve outcomes that benefit the community. And it’s a scenario that militates against the kind of agile, efficient service delivery that brings tangible benefits to residents.

These are tough challenges for local authorities to overcome and they are made tougher still by the legacy applications that they often still use to manage assets. Many applications simply cannot cut it in today’s digital world due to antiquated architectures and slow performance.

Change is required if communities are to remain resilient. Digital transformation will be key but for most authorities, their journey has a long way to go. 

Driving the digital revolution

Given the speed at which the technological revolution is moving, councils need to direct more resources towards digital transformation, with a view to increasing efficiency and promoting technological innovation.

Data-driven decision-making is about optimising local government performance, and if we can achieve this we’ll create efficiencies that free up valuable resources for the areas where they are most needed. This approach requires us to stop dividing workflows, assets, people and resources into separate categories and instead seek to connect them. The same applies to the various council departments – instead of taking a siloed approach to asset management and service delivery, we should be looking for synergy and alignment.

In this context, one of the biggest opportunities for local authorities is presented by the emergence of the Internet of Things. IoT-connected devices, sensors and control elements linked to operational software platforms can give councils the power to remotely control and monitor the condition of their assets, issue work orders, track workflow, and optimise resource usage.

There is a raft of possible applications for IoT across council operations. Many councils started the ball rolling a few years ago by placing sensors on street lights. Utilising data or communication networks, they focus on getting these assets to self-report to new central management systems (CMSs) installed at their headquarters.

It’s an approach that has the potential to turn asset maintenance and management into a more proactive process for the councils. An alert can be sent over the network to the CMS when a bulb blows, providing the council with an immediate update on the situation rather than having to wait for a monthly inspection or rely on a member of the public to report the problem. The result is that the council can be more energy-efficient and provide a better service.
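
As a simple sketch of how such an alert might reach a central management system, the snippet below posts a fault report for a street light over HTTP. The endpoint, field names and asset ID are invented for illustration; a real CMS defines its own API and may well use a low-power protocol such as MQTT or LoRaWAN rather than plain HTTP.

```python
# Hypothetical sketch: a street-light controller reporting a blown lamp to a
# council's central management system (CMS). Endpoint and fields are invented.
import json
from datetime import datetime, timezone
from urllib import request

CMS_URL = "https://cms.example-council.gov.uk/api/faults"  # placeholder endpoint

def report_fault(asset_id, fault_code):
    payload = {
        "asset_id": asset_id,                   # e.g. a street-light column reference
        "fault": fault_code,                    # e.g. "LAMP_FAILURE"
        "reported_at": datetime.now(timezone.utc).isoformat(),
    }
    req = request.Request(
        CMS_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:          # the CMS can then raise a work order
        return resp.status

if __name__ == "__main__":
    report_fault("SL-00412", "LAMP_FAILURE")
```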

One of the biggest potential benefits of IoT is the opportunity to reduce the cost of service delivery. Sensors strategically placed in litter or commercial waste bins could allow councils to develop more efficient routing and collections, and even to review, based on fill levels, whether bins are really required in certain locations. Operational efficiencies could be achieved by using sensors on bin lorries to monitor potholes and road conditions while waste collection teams go about their daily operations, rather than sending out a separate team to investigate or relying on members of the public to report issues.

Sensor-based technology can also enable councils to make safety improvements. Sensors could, for example, be used to measure the lean of a dangerous tree and alert the authorities if it worsens. Moreover, they could be deployed to measure water or silt levels and give early warning of potential flooding, enabling the council to take proactive action rather than waiting until an incident occurs and infrastructure assets and the public are at risk.

Added to this, we see future opportunities to use IoT-connected devices to measure air quality, especially with a growing number of low-emission zones now being introduced across the UK.

Sensors can also help improve decision-making. Collecting all this data across IoT networks puts councils in a good position to make more informed decisions about their infrastructure and how to manage it in an economic and cost-effective way.

Building consensus

IoT-based technology is key to the success of digital transformation within public sector organisations, and local government in particular. Of course, technology alone can never be a panacea.

To be a success over the long term, it is also important that digital transformation is a truly organisation-wide initiative. Its success depends on organisations taking their staff with them on the journey, training them to understand the technology and ensuring they appreciate why the organisation is changing its approach. But it is also key that initiatives not only have senior management buy-in but are driven from the top. That combination of organisation-wide engagement and technology implementation is key to digital transformation success across the public sector today. Getting all this right is challenging, of course, but for those councils, authorities and organisations that rise to the challenge, the benefits are potentially great, both for themselves and for the public they serve.

Phil Oldbury, Director of Customer Service, Yotta

Fujitsu supercharges healthcare AI

Thu, 07/04/2019 - 09:17

Fujitsu has revealed a new AI system which could lead to patient wait and processing times being slashed.

The new system is able to process medical notes and health records much faster than the human eye, achieving an improvement of more than 90 per cent in terms of time saved.

It may be a staple of medical comedy that doctors traditionally have bad handwriting, but Fujitsu says that its system can examine and extract information in less than a minute, compared with the 15 minutes normally required for manual analysis.

With medical professionals increasingly stretched for time when dealing with multiple patients, this technology could not only greatly improve the quality of patient care but also allow more patient data to be stored for more in-depth diagnosis.

“Our co-creation strategy with partners such as the San Carlos Clinical Hospital has provided us with an important insight into the challenges faced by the healthcare sector, particularly in terms of supporting clinical decision-making," said Fujitsu Laboratories of Europe’s Chief Executive Officer Dr Adel Rouz. "We have succeeded in creating a number of important innovations that are already making a difference to medical professionals’ workflow."

"This latest advance is another step, helping to improve the accuracy of clinical data and automate its digitalisation for hospitals, medical insurance companies and government agencies. We believe that our technology has wider applications and can easily be adapted to solve similar challenges in other domains, such as insurance, legal and compliance.”

Fujitsu says it is now planning to roll out the system to a wider range of institutions later this year.

Amazon hiring 2,000 more UK workers

Thu, 07/04/2019 - 07:30

Amazon is expanding its operations in the UK and is, to that end, looking to hire a few thousand new employees.

Bloomberg reports that the 2,000 new workers the tech and retail giant is looking for include engineers, software developers and data scientists.

Ten per cent of those jobs will be located in Amazon’s development centres in Cambridge, Edinburgh and London, it was said. It is in these centres that Amazon develops different technologies, like Alexa, Prime Air (its drone delivery service) and others.

“We are delighted to be able to continue to invest and grow our U.K. business,” Doug Gurr, head of Amazon’s UK operations, said in a statement. “The U.K. is a fantastic hub for global talent and the exciting, innovative work that takes place here benefits Amazon’s customers around the world.”

In late February this year, Amazon added 1,000 apprenticeships in the UK, covering a range of positions across corporate and operations sites. These were located in London, as well as in development centres such as Edinburgh.

Late last year, in October, Amazon added yet another 1,000 jobs in the UK, with Manchester getting its first Amazon office back then.

Roughly 600 positions went to either corporate or development jobs, with 250 positions in Edinburgh’s development centre and 180 positions in Cambridge.

“These are Silicon Valley jobs in Britain, and further cement our long-term commitment to the UK,” Gurr said back then.

Top technological trends for 2019 and beyond

Thu, 07/04/2019 - 07:00

Technologies are evolving, innovation around technology is accelerating, and our connectivity and interactions with the world are growing ever faster. Since 2018 was a year driven by technology, it is only natural that this trend will continue in 2019 as well. From artificial intelligence-inspired devices to blockchain and machine learning, 2018 witnessed exponential growth in technology and tech-based careers. By constantly learning to keep up with current trends, the IT workforce is keeping an eye on the future of technology to enable the growth of businesses. So, if a business wishes to reap the maximum benefit from evolving trends with the help of IT professionals, it is essential that entrepreneurs and business owners keep up with those trends as well. Here’s a quick look at the top technological trends that are set to govern 2019…

Internet of Things

The Internet of Things (IoT) extends internet connectivity into physical devices such as vending machines, alarm clocks and cars. Although IoT has been around for several years, it is yet to develop fully, which is why this technology will continue to trend in 2019 as well. As a result of IoT, business models have become more innovative, as companies are now able to improve customer service, speed up processes, enable better safety measures, and offer predictive maintenance and more efficient decision-making. Thanks to the many benefits offered by IoT, in 2017 alone over 8.4 billion devices worldwide had extended internet connectivity; a number that is predicted to reach 30 billion by 2020. So, if your business has not yet made the smart move of switching to IoT, 2019 is the year to do so.

Automation

With advancements in internet connectivity and machine learning, automation has become the norm for performing several business tasks. With minimal human assistance, automation has replaced traditional working methods in companies, furnishing more detailed reports within a matter of minutes as opposed to gruelling hours of manual labour. The process of managing and interpreting large volumes of data and tasks has been simplified massively, improving the overall efficiency of the business world. The most widely used form of automation today is Robotic Process Automation (RPA), which uses software to simplify traditional processes and ultimately replace manual labour.

AI, RPA, and Machine Learning

RPA technology has automated business processes by analysing data, processing transactions, interpreting applications, replying to emails, and so on. As a result, over 45 per cent of menial, repetitive and tedious activities have been automated, which is not only creating plenty of career opportunities but also driving the evolution of industries.

AI (artificial intelligence) has generated a lot of buzz in recent years as it has become part of daily human life. Mimicking human intelligence, AI is now able to perform tasks such as recognising speech patterns and images and supporting decision-making. And although AI has to develop a lot further before it becomes entirely autonomous, businesses have learnt to adopt it in routine tasks such as programming, testing, support and maintenance.

Machine learning is the study of statistical models and algorithms that enable computers to discover and analyse patterns and derive insights from the data they are fed. A subset of AI, machine learning has been deployed in almost every sector, and this market is expected to grow to over $8.81 billion by 2021. Be it pattern recognition, data mining or data analytics, machine learning has become the one-stop shop for all major tasks.
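
As a minimal, illustrative example of discovering patterns from fed data, the sketch below trains a simple classifier on scikit-learn's built-in iris dataset and scores it on unseen samples; it is a toy demonstration rather than a production pipeline.

```python
# Toy machine learning example: a classifier learns patterns from labelled
# data and predicts labels for unseen samples (scikit-learn's iris dataset).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000)   # fit a statistical model to the training data
model.fit(X_train, y_train)

print(f"Accuracy on unseen data: {model.score(X_test, y_test):.2f}")
```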

Data and cybersecurity

Since most tasks are automated and large amounts of data are held online, security becomes a prime concern. As a result, information security has become a rising trend, preventing hacking, data corruption and unauthorised access to websites and databases, alongside implementing adequate digital privacy measures. Apart from data security, cybersecurity has also come to the forefront. In order to prevent any form of illegal access to data at the hands of hackers, the implementation of tough security measures is on the rise. While cloud technology and deep learning are among the top advancements in cybersecurity, there is no denying that as long as hackers exist, the need to evolve and improve cybersecurity measures will always remain a priority.

Augmented and virtual reality

Although it is widely believed that virtual reality (VR) is used only in the gaming world, recent developments have shown that VR can be used for much more than that. Today, VR shows ample potential in education, training, entertainment and rehabilitation. Be it training doctors to perform surgery or providing entertainment at theme parks, the market and scope for VR is tremendous. As a result, several industries have realised the benefits of VR and augmented reality and have implemented them in sectors such as healthcare, engineering, manufacturing and space exploration.

Blockchain

Blockchain technology, widely linked to cryptocurrencies, is highly resistant to modification of data, making it one of the most secure mediums for protecting information and data online. Blockchain is one of the fastest-growing technologies of the 21st century, with companies using it to create a decentralised, reliable and secure system not just to safeguard their sensitive data but also to record and verify all forms of transactions that take place within the company.

Hybrid cloud

Cloud computing has become core to businesses and continues to grow at a rapid pace. But growing concerns over cloud security have led to the development of the hybrid cloud model, in which a company can mix its existing on-premises infrastructure with private and public cloud services. This not only creates a more flexible and efficient model, but also offers companies greater agility, with fewer outages and less downtime. Several companies already make use of cloud service providers such as Microsoft Azure, Google Cloud and Amazon Web Services, and this usage is only set to increase in the coming years.

Smart spaces

Smart spaces are the latest technological trend with roots intertwined with IoT. From smart speakers and smartphones to smartwatches, this technology has become part and parcel of our daily lives. By enabling interaction between humans and AI, a more automated and interactive environment is being created that is set to reach businesses and industries in 2019.

Technology and the internet have become the centre of attention for businesses and industries worldwide, and 2018 was proof of that. With a host of trends emerging over the previous year, the probability of further growth is simply undeniable. So, as an entrepreneur or an established business owner, you need to broaden your thinking, explore technologies and implement the ones that will aid the growth of your company even further.

John Tie, Digital Marketer and content strategist, Virtual Employee

Broadcom reportedly in talks to buy Symantec

Thu, 07/04/2019 - 07:00

Broadcom is looking to buy Symantec, but nothing is yet official. This is the highlight of Bloomberg's new report, which cites unnamed sources familiar with the matter.

Apparently, the deal is not yet public and could fall through at any time, which is why both companies are silent on the matter. We don't know yet how much money is at stake here, but Financial Times seems to believe it could be close to the $15bn mark.

What we do know for certain, however, is that Symantec's shares have been trading yesterday at prices the company hasn't seen in eight months.

On Wednesday, the company's shares rose 16 per cent, hitting $22.10. At the same time, Broadcom's shares fell 3.5 per cent, closing at $295.33 on Tuesday.

Broadcom seems to be serious about its idea to move further into software. This could be its second large acquisition, after it recently bought CA Technologies, a move which not everyone greeted kindly.

“Symantec would make a perfect fit for the Broadcom portfolio,” Harsh Kumar, an analyst at Piper Jaffray, wrote in a note to investors. According to him, the situation is similar to when Broadcom bought CA, “which ultimately turned out to be extremely successful under the Broadcom umbrella.”

Symantec was founded in 1982 and is headquartered in Silicon Valley. It offers a wide array of cybersecurity solutions, most of which are for corporate computer networks. It recently bought LifeLock, a company specialising in identity theft prevention, for $2.3bn.

 

Kaspersky tracks down major new ransomware

Thu, 07/04/2019 - 06:30

Cybersecurity researchers from Kaspersky have discovered a new type of ransomware, and this one seems to be more dangerous than any of its predecessors for one key reason.

The ransomware, named Sodin, takes advantage of a zero-day vulnerability in the Windows operating system, which means that victims don’t even need to download and run a malicious attachment (which was typically essential for the success of a ransomware campaign).

Instead, attackers need only find a vulnerable server and send a command to download a malicious file called “radm.exe”, which then saves the ransomware locally and executes it.

The Windows vulnerability is now known as CVE-2018-8453.

Sodin also uses what’s known as the “Heaven’s Gate” technique, which allows the malicious program to execute 64-bit code from a 32-bit running process. Kaspersky claims this doesn’t happen often in ransomware. This makes the ransomware harder to detect, as well as harder to analyse.

Most Sodin targets are in Asia: 17.6 per cent of attacks hit Taiwan, 9.8 per cent Hong Kong and 8.8 per cent South Korea. However, Kaspersky says Europe, North America and Latin America weren’t spared.

Each victim was asked to pay $2,500 worth of bitcoin.

“Ransomware is a very popular type of malware, yet it’s not often that we see such an elaborate and sophisticated version: using the CPU architecture to fly under the radar is not a common practice for encryptors. We expect a rise in the number of attacks involving the Sodin encryptor, since the amount of resources required to build such malware is significant. Those who invested in the malware’s development definitely expect it to pay off handsomely,” said Fedor Sinitsyn, a security researcher at Kaspersky.

How messaging apps are raising the stakes with Slack

Thu, 07/04/2019 - 06:30

Chat and messaging apps have been dubbed "email killers" on numerous occasions, but it seems the business world has room for both. Now, however, different chat apps are battling for the attention of employees everywhere.

What are your thoughts on messaging apps like Facebook Workplace and Microsoft Office launching new features to compete with Slack?

Facebook Workplace is best suited to global communication and was originally designed for a general social context, whereas Slack’s platform has gained notice for its adoption by small teams collaborating seamlessly with each other, helped by features such as document sharing and app integrations. Facebook quickly recognised these advances and improved its own document management capabilities considerably by adding native Google and Microsoft file pickers. Employees can now find specific content and upload files effortlessly through Facebook Workplace thanks to its new integrations with sharing platforms such as OneDrive, SharePoint, Dropbox and Google Drive. Facebook Workplace has also added a feature that allows employees to attach documents and insert comments simultaneously.

Microsoft Teams is another messaging platform that has been innovating with its features lately. Microsoft includes Teams in its Office 365 bundle, which gives consumers access to various Microsoft applications on one platform. Teams is looking to leverage this integration by bringing more applications from the Microsoft suite, such as Skype and SharePoint, into the platform. Its chat feature also allows users to communicate with people who use Skype for Business. In addition, Teams is looking to target specific verticals with pre-created and tailored environments. To support this, Teams is rolling out ‘team templates’ for the healthcare, retail and education industries. These templates come pre-created with the channels, apps and workspace needed for that specific industry. As we can see, messaging app platforms are now becoming much more prominent in the overall user experience than ever before.


From your perspective, what do you think are the best practices and features enterprise platforms should implement in order to boost employee productivity, streamline collaboration and stand out amongst the niche, competitive market?

It depends on the industry; enterprise platforms may work better in some areas than others. Slack has been very popular with desk-based and tech-savvy users. Teams is a good choice for enterprises that rely heavily on Microsoft systems. Industries such as retail and hospitality have a large non-desk workforce and have not seen much traction with messaging app platforms. Facebook Workplace has seen better success with the frontline workforce because of its well-known and simple user experience.

Slack has struggled as a channel for enterprise-wide communication yet has been the most effective at the team collaboration level, while Facebook Workplace faces the opposite challenge. Enterprise platforms will need to evolve to cater to specific target audiences and offer a wide variety of features that every type of user can benefit from.

Enterprise platforms also need to invest in artificial intelligence and machine learning to enable users to become more productive. Slackbot is a useful tool that reminds you to complete certain tasks and can automatically post reminders to channels or individuals before an event begins. Microsoft Teams strategically implemented an approach of providing industry-specific templates to help businesses adopt the platform more quickly. This allows businesses to test which template works best for their team and discover features that can help employees become more organised, streamlined and collaborative.

Other essential assets and features enterprise platforms should implement are the following:

Conversation Style Platform: The conversational style of messaging apps naturally facilitates a casual communication style amongst employees, and with customers if businesses have that type of relationship with them.

Search: Both Slack and Microsoft Teams offer powerful full-text search, which makes it easy for users to find documents, posts, files, etc. more quickly.

#Tags, @mentions: Features such as @mentions make it easy for users to notify others, while #tags make content easier to organise and search for.

Integrations: One of the most critical features is integration with the systems and tools already used by the users of the messaging platform. This includes document management tools, CRM systems, etc., enabling users to take action in third-party systems directly from the messaging platform, or to receive updates from those systems (see the sketch after this list). This type of integration makes it easier for users to stay on top of everything and to collaborate in context while executing their daily tasks.

Task and To-Do: Platform support for tasks enables users to quickly create and assign tasks amongst team members. Push notifications on mobile and desktop enable timely reminders.
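
To illustrate the integration point above, the sketch below shows a hypothetical third-party system (a CRM, say) pushing an update into a channel via a Slack incoming webhook; the webhook URL and message text are placeholders, and Teams and Workplace offer comparable connectors.

```python
# Sketch of a third-party system (e.g. a CRM) pushing an update into a
# messaging platform via a Slack incoming webhook. The URL is a placeholder.
import json
from urllib import request

WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

def notify(text):
    body = json.dumps({"text": text}).encode("utf-8")
    req = request.Request(WEBHOOK_URL, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return resp.status

notify("Deal #1042 moved to 'Closed won' in the CRM")
```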

What do you predict the next wave of features to be as messaging apps continue to develop helpful tools for businesses? 

The next wave of features from messaging platforms will continue the integration of enterprise systems. While this is already possible through an API-based approach, messaging platforms are looking to provide a UI framework that lets enterprises write custom lightweight applications. These lightweight applications are single-purpose apps that enable users to perform a specific task. Slack recently introduced the Block Kit framework, which enables developers and enterprises to build apps on Slack, and Microsoft Teams introduced Adaptive Cards. The micro-app concept is gaining a lot of traction and will be the core differentiator across all messaging platforms in the near future.
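
As a rough sketch of what such a single-purpose micro-app might post, the payload below uses Slack's Block Kit JSON format to present an approval request with action buttons; the wording, amounts and action IDs are invented for illustration.

```python
# Illustrative Slack Block Kit payload for a single-purpose "micro-app":
# an expense-approval message with action buttons. Values are invented.
import json

approval_message = {
    "blocks": [
        {
            "type": "section",
            "text": {"type": "mrkdwn", "text": "*Expense claim #88*: £142.50 awaiting approval"},
        },
        {
            "type": "actions",
            "elements": [
                {
                    "type": "button",
                    "text": {"type": "plain_text", "text": "Approve"},
                    "style": "primary",
                    "action_id": "approve_expense",   # handled by the app's interaction endpoint
                },
                {
                    "type": "button",
                    "text": {"type": "plain_text", "text": "Reject"},
                    "action_id": "reject_expense",
                },
            ],
        },
    ]
}

# The payload can then be sent via chat.postMessage or an incoming webhook.
print(json.dumps(approval_message, indent=2))
```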

Praveen Kandyadi, Cofounder and VP of Product, Groupe.io

Connected cars and the future of transport

Thu, 07/04/2019 - 06:00

Ankur Bhan, Head of WING at Nokia, discusses the future of connected cars, security of IoT devices and the evolution of Nokia’s WING business.

How has the WING business evolved since its inception?

Since its 2017 inception, the basic WING business model remains relatively unchanged. It’s a pay-as-you-grow, managed service built on a cloud-native, globally distributed core, optimized to deliver best-in-class IoT services. It enables Communication Service Providers (CSPs) to differentiate themselves by delivering to their enterprise customers a fast, reliable and scalable service to all corners of the globe.

In 2019, we expanded the business to include four market-ready, pre-packaged IoT vertical solutions to serve as a catalyst for CSPs to grow their IoT businesses. These solutions cover smart agriculture, livestock management, logistics and asset management. This expansion did not change our fundamental business model, but it did accelerate our CSP customers’ time-to-market with IoT services for their enterprise customers.

In the future, the cloud-native, globally distributed nature of WING will support further expansion into critical areas such as the IoT edge. Use-case demands for lower latency, the volume and velocity of data as IoT scales, and data governance regulation will drive data processing and real-time control ever closer to devices. This is the harbinger of some of the compelling use cases envisioned for 5G.

What are the most common connectivity demands you are seeing from your automotive partners? Has this changed at all within the last year or so?

For clarity, we sell WING to CSPs, who, in turn, offer connectivity services to automotive OEMs. The WING business model is a white-label, managed service to CSPs, who then brand and sell the service to enterprises.

For connectivity demands, automotive OEMs want the ability to work with one CSP that can provide SIMs that work across the globe. They need to maintain an ongoing connection to their vehicles that gives them real-time visibility and management across their entire global footprint. They need a common global core to support uniform SLAs and a consistent set of operations. They need local breakout to deliver the high-volume, high-bandwidth, low-latency connections that many connected automotive services require. They need local data storage to meet data sovereignty regulations. They need one simple global integration to drive down costs. They get all of this with WING.


What are the major challenges affecting IoT deployments today?

IoT is global in nature. More than 70 per cent of IoT projects cross national borders; yet CSP networks are bound by geographic constraints. A fundamental challenge for CSPs is to offer IoT services that seamlessly cross geographic borders and networks and empower enterprises to optimally deliver and manage borderless IoT services.

CSPs do not have the resources to build and manage a global IoT network. They need a partner that lessens the upfront investments needed to scale to mass market and to go global. With its global network and pay-as-you-grow model, WING is that right partner to rapidly drive market success.

Finally, when services need to cross national and geographic borders, ensuring data regulatory compliance is a key and escalating challenge. Data privacy laws are evolving and differ across the globe, with a growing regulatory demand to keep user data local.

How do you ensure you keep your connectivity safe and secure from threats?

The WING infrastructure is hosted in secure public data centres with the highest level of physical and virtual security possible. Our architecture integrates Nokia’s NetGuard security platform, which provides a comprehensive security backbone to eliminate any possible threats. The WING IoT Command Centre provides 24/7 monitoring, 365 days a year.

How do you see your work changing in the next year or so?

As more and more CSPs are onboarded into the WING ecosystem, an increasing amount of effort will be focused on helping CSPs build their pipeline of enterprise customers. WING is currently compatible with a broad range of access network technologies, but as 5G begins to scale next year, it will unlock a new set of IoT services (e.g., autonomous vehicles, Industry 4.0, video-as-a-sensor) with more demanding connectivity requirements. Our efforts will be focused on meeting those more stringent demands head on.


Do you think connected cars can act as a useful testbed for other IoT-connected devices in the future?

Connected cars will be at the forefront of the IoT revolution, transforming the driving experience for consumers and the entire automotive industry. It will enable a rich set of compelling IoT use cases, including the ultimate IoT device, the fully autonomous vehicle. As such, the connected car will boldly demonstrate the tangible ways that IoT will impact our lives, including safety, convenience and efficiencies. It will provide one of the most fertile testbeds for the possibilities of IoT.

What more needs to be done to ensure more IoT devices land in common usage?

From an enterprise perspective, more work needs to be done to align IoT technology and devices with tangible business needs and value (ROI), whilst helping enterprise decision makers to visualise what is possible when entering the IoT market and its revenue streams. 

From a consumer perspective, cost is the main driver. IoT devices will need to demonstrate value. B2C enterprises will need to provide simple ‘plug-and-play’ devices whilst keeping initial and recurring costs as low and affordable as possible.

Ankur Bhan, Head of WING, Nokia

Many companies still don't feel comfortable handling customer data

Thu, 07/04/2019 - 06:00

For a company to be considered ‘master of its data’, it needs to be able to do four things: manage, secure, gain insights and use the data responsibly. A large majority of organisations in the EMEA region aren’t confident they can do that properly.

This is according to a new report from Oracle which sheds light on a couple of important points. Almost half of organisations don’t have a data management strategy set up, while just a third (35 per cent) are confident they can manage data and generate ‘meaningful insights’.

Key departments are still not considered both accountable and responsible for data management, while data security protocols remain ignored, or misunderstood. The report claims that almost half of all finance and IT decision makers believe they are accountable for data security, while among marketing and HR (which usually use this data), just a third feels the same way.

“We know that being able to leverage data gives immense business benefit and a lead that others find hard to diminish,” said Andrew Sutherland, Senior Vice President, Technology and Systems, Oracle APAC and EMEA.

“But these findings suggest that organisations are still being overwhelmed by the data deluge faced. Companies need to tackle the problem head on. This will come from better internal practices and putting data management strategies and enhanced security controls in place.  Additionally, the prudent use of cloud and emerging technologies like AI and automation will also be key as we hit that tipping point where the data and security challenge is becoming just too big for humans alone.”

The biggest fears about data security within organisations are low attention to data confidentiality and weak data access controls. There are also issues with how data is handled on mobile devices and social platforms.

 

Can increased trust in employers help society win the war on fake news?

Thu, 07/04/2019 - 05:30

Whom to trust is an increasingly complex issue, and separating fact from fiction, in an age when anyone can mass-produce and share information, is getting harder. At the same time, trust continues to be shaky. Today people are most trusting of others they believe to be like themselves and those they have a ‘close’ relationship with.

Interestingly, one traditional institution that is gaining people’s trust faster than any other is ‘My Employer’. This was a key finding in the 2019 Edelman Trust Barometer study and it offers huge potential for HR professionals looking to develop greater engagement and cultural cohesion in their workplaces, plus improved media literacy.

Trusting the ‘hand that feeds you’

Globally people have more trust in their employers than in any other single institution, with trust levels running at 75 per cent. This is 19 points more than corporate business in general and 27 points more than government. Employers are in a position to really positively influence their workforces.

At the same time, we know that people are more worried than ever before about misinformation and fake news - content created to be deliberately misleading or to be used in a misleading way. Globally, 55 per cent of people surveyed for the 2019 edition of the Reuters Institute Digital News Report, covering over 35 countries, said they were concerned about misinformation. According to the 2019 Edelman Trust Barometer, 73 per cent are worried about fake news being used as a weapon to influence, and people are encountering roadblocks in their quest for facts.

In the UK, concerns about fake news are particularly high, which is unsurprising given the ambivalence around Brexit. 70 per cent of Reuters Institute survey respondents agreed with the statement "I am concerned about what is real and what is fake on the internet", up 12 per cent on the results from a year ago.

Combined, these findings highlight that if employers really want to build engagement and a strong culture, they already have the trust of their workforces as a starting point. And when it comes to education about fake news, they will have their employees’ attention.

How can you support employees in navigating the world of fake news? And how do you guide them towards accurate and informed sources when they turn to Google for research?

Here are four aspects of the media in context of wider society that every employee should be aware of.

Becoming media literate

The Ofcom 2019 Media Literacy study shows that UK adults rely heavily on their phones to manage their lives and are not as savvy about the information they consume as you might expect. There has also been little change in critical awareness in the past few years, with many still lacking the critical skills needed to identify when they are being advertised to online. Over 40 per cent of people are not aware that YouTube is funded through advertising when they view video content online, and 50 per cent of consumers do not appreciate that Google’s revenues are generated through search advertising.

One in three adults never uses a computer to go online and one in ten only uses a smartphone, up since 2017, with social media becoming a key source of information. Compared to 2017, internet users are more likely to have encountered hateful content online, however around a third don’t do anything about it – unsure what to do or whether it will make a difference.

Does the truth exist?

The question of whether the truth actually exists has been debated for many years, and now we have the notion of ‘post-truth’, based around the idea that people will believe information if it resonates with their emotions and beliefs. Rather than weighing up the facts, they will accept as ‘the truth’ information that appeals to their prejudices and sensibilities, which is why fake news has so much potential to influence. In a post-truth era, we believe what we want to believe.

Blatantly fake news vs. ‘alternative facts’

72 per cent of people say they consume news at least weekly, and they share or post content several times a month. Now, in response to the threat of fake news, they say they are turning to more "reputable" sources, but do they know what a reputable source looks like?

Deciding what constitutes a reputable news source is frequently left to people to determine by themselves. It’s a difficult task when campaigns like “Fishwrap” are freshening up old news and recycling it as new.  Anonymous fakers are using increasingly sophisticated techniques to publish misleading information and influence voters, for instance, using fresh stories that link back to old sources running on anonymous servers.

Even Donald Trump is regularly linked to fake news, and his inauguration was marked by the release of ‘alternative facts’ concerning how many people attended the ceremony. So what is the difference between an alternative fact and fake news? Given the position of trust they hold, employers can play an important educational role here, helping people determine what is real, fake or just an ‘alternative’ version of the truth.

Seeing is believing, or is it?

We’ve all seen those memes and videos, many of them totally implausible: sharks in swimming pools, monkeys playing the guitar. They are harmless and obvious fakes, but many images are used to deliberately mislead and influence. Teach your employees easy ways to verify authenticity. A simple check to test whether an image is genuine is to right-click on anything suspicious and use Google Reverse Image Search to confirm its source and production date.

The darker side to personalisation

Daily Mail or Sunday Times reader? Most people have a distinct preference when it comes to the places they go for news and tend to use sources that align with their beliefs. But what constitutes ‘news’ today goes far beyond confirming our unconscious biases. We are all fed information that is consistent with our user profiles and search histories, which means it is too easy for the unsuspecting to be influenced by fake news. The media has proliferated to such a degree that we are being targeted with the messages we want to believe, and it is easy to live in our own personal informational worlds, or ‘bubbles’ as Barack Obama describes them.

The best fake news is incredibly real

There have always been people with kooky ideas, and most of the time it is pretty harmless ‘crazy uncle’ content, but that is the danger with fake news. The best examples are designed to be plausible, something you want to be true, and when you are only being fed information that matches your beliefs, it is even harder to be objective. It is akin to propaganda on a grand scale, often done covertly and micro-targeted to such a degree that people don’t even realise it.

Social media in particular plays a key role in the dissemination of fake news, and for many people who access the internet via a smartphone only, it is their primary source of information. These users are now less likely than in 2017 to see views they disagree with on social media, and 25 per cent of respondents stated they ‘rarely’ see alternative viewpoints that challenge their beliefs.

Employers have a duty of care when it comes to media literacy, helping their employees become more discerning about online information; being in such a trusted position shows they have the means to do so.

Stephen Humphreys, general manager, GoodHabitz

How to keep perishable goods in optimal condition with remote temperature monitoring

Thu, 07/04/2019 - 05:00

Perishable goods are problematic to transport over long distances. They need to be kept under specific temperature and humidity conditions at all times, and their short shelf-lives mean that their value reduces over time, even if they are handled correctly throughout the supply chain. The main priority for any manager responsible for this process is ensuring the integrity of temperature-sensitive products. Operating a fleet of vehicles is challenging enough on its own, but shipping cargo of a delicate nature can turn it into a logistical nightmare. Field managers have to be meticulous throughout the delivery process to ensure the safety of perishable goods and that every shipment arrives in perfect condition.

Fleet management systems are by far one of the most reliable solutions on the market to safely transport all types of delicate and perishable goods. These sophisticated systems integrate GPS tracking devices with IoT-enabled sensors, offering around-the-clock remote temperature and humidity monitoring. Let’s take a quick look at some of the challenges associated with transporting perishable goods and the tools offered by fleet management systems to overcome them.

What are perishable goods?

To avoid needless mistakes when planning a delivery, it’s important to first distinguish which products are perishable. Most packaged foods and products come with guidelines on how they should be stored, and it’s always worth checking that your transportation or delivery conditions meet these. Perishable goods are foodstuffs and other products that have short shelf-lives and tend to spoil, decay or deteriorate unless kept under specific temperature and humidity conditions. For example, food products such as meat, dairy, poultry, fish, vegetables, fruit and pre-cooked foods must be kept at low temperatures, otherwise, mould or harmful bacteria can develop. As well as harming the odour, taste and/or visual appeal of the food product, bacterial growth and decay eventually make it unsafe to eat. Other perishable goods include certain chemicals, pharmaceuticals, vaccines, and even flowers. These products are susceptible to severe deterioration during transportation due to their delicate or volatile nature.

Things to know when transporting perishable products

When transporting perishable goods, it’s essential to ensure the shipping conditions are correct. A simple oversight could spoil the entire shipment, which is both financially painful and a source of excessive downtime. Maintaining the correct temperature during transportation is by far the biggest factor in retaining the quality of perishable goods until delivery. However, monitoring the temperature of a cargo hold is challenging. For decades, companies only had rudimentary methods to monitor cargo, such as manual inspection by the driver at each delivery or even mid-delivery for long-haul trips. Naturally, this caused undue downtime but was considered necessary to avoid the all-too-familiar disaster of a shipment arriving in a completely ruined state.

Integrated temperature and humidity monitoring

Fortunately for the companies and drivers transporting perishable goods, modern GPS tracking devices can be integrated with advanced sensors to monitor the condition of commercial vehicles and their cargo holds. This means they can do much more than just track location; they can also accurately measure and track the temperature and humidity levels in the cargo hold around the clock. This information is sent to a remote server that can be accessed on-demand, offering full awareness of the condition of transported goods. Field managers can set up event alerts with pre-determined temperature and humidity limits, which will trigger an automated warning to the vehicle’s driver and responsible supervisor if surpassed. What’s more, the system will send an alert if it detects abnormalities such as fluctuating temperature or humidity levels, which can indicate a malfunction.
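
A simplified sketch of that threshold logic is shown below: each incoming sensor reading is checked against pre-set limits and an alert is raised if it falls outside them. The limits, field names and alert handling are assumptions for illustration; production fleet platforms add alert routing, rate-of-change checks and sensor-failure handling.

```python
# Minimal sketch of threshold-based cargo monitoring: check each reading
# against pre-set limits and flag violations. Limits and fields are invented.
from dataclasses import dataclass

@dataclass
class Limits:
    temp_min: float = 2.0       # degrees C, e.g. for chilled produce
    temp_max: float = 8.0
    humidity_max: float = 85.0  # per cent relative humidity

def check_reading(vehicle_id, temp_c, humidity_pct, limits=Limits()):
    alerts = []
    if not limits.temp_min <= temp_c <= limits.temp_max:
        alerts.append(f"temperature {temp_c:.1f}C outside {limits.temp_min}-{limits.temp_max}C")
    if humidity_pct > limits.humidity_max:
        alerts.append(f"humidity {humidity_pct:.0f}% above {limits.humidity_max}%")
    for msg in alerts:
        # in practice this would notify the driver and supervisor (SMS, push, email)
        print(f"ALERT [{vehicle_id}]: {msg}")
    return alerts

check_reading("TRUCK-17", temp_c=9.4, humidity_pct=78)
```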

Role of GPS tracking devices

So, what makes GPS tracking devices better than alternative solutions on the market for remote temperature monitoring? One reason is that vehicle tracking systems can be used with any type of vehicle or trailer, as both portable and hardwired tracking devices can be integrated with IoT-enabled sensors. However, the key reason is that real-time and accurate monitoring of the condition and location of vehicles in a fleet has been proven to significantly optimise fleet operations, leading to reduced downtime and reduced paperwork.

Optimised operations

Ensuring that each shipment arrives on time and in excellent condition can seem like an impossible task for fleets with hundreds or thousands of vehicles, but sophisticated fleet management systems simplify the entire process. The sensors integrated into the tracking devices allow field managers to ensure that the cargo hold of every vehicle in the fleet is optimised for the delicate cargo inside. Changes in temperature, engine ignition, start/stop locations and many other factors that could affect the condition of the goods in the cargo hold can be recorded, viewed and analysed, helping managers make informed decisions for future deliveries and improve business operations even further.

Reduced downtime

When there’s a problem with a vehicle, identifying the problem is the first step. The vehicle tracking device will send an automated warning by an instant alert if the cargo hold conditions change unexpectedly or the vehicle breaks down. If the issue can’t be rectified, the cargo may have to be transferred to another truck to ensure its integrity. In such situations, having accurate location data to hand is invaluable; the manager can instantly locate the malfunctioning vehicle, as well as any vehicles nearby, and contact each driver remotely to update them with new instructions and destinations, saving a considerable amount of downtime.
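
To illustrate how that location data can be put to work, the sketch below ranks the rest of the fleet by great-circle (haversine) distance from a stranded vehicle so the nearest truck can be redirected; the coordinates and vehicle IDs are made up.

```python
# Sketch: rank fleet vehicles by haversine distance from a broken-down truck
# so the nearest one can be redirected. Positions and IDs are invented.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))       # Earth radius of roughly 6371 km

fleet = {                                    # last known positions (lat, lon)
    "TRUCK-03": (51.509, -0.118),
    "TRUCK-11": (51.752, -1.258),
    "TRUCK-17": (52.205, 0.119),
}
stranded = (51.481, -0.191)                  # broken-down vehicle

for vid, pos in sorted(fleet.items(), key=lambda kv: haversine_km(*stranded, *kv[1])):
    print(f"{vid}: {haversine_km(*stranded, *pos):.1f} km away")
```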

Reduced paperwork

Another important aspect of delivering perishable products is the increasing demand from customers for proof-of-service, which entails a specific set of information such as temperature and humidity levels throughout transportation. Fleet management systems provide accurate and up-to-date information about the condition of the products, and can produce reports and delivery logs automatically, leaving drivers and managers with the precious extra time that was once spent on paperwork.

Products ranging from vegetables to vaccines all need to be monitored and shipped in controlled environments; that is why the introduction of vehicle tracking devices, IoT-enabled sensors and fleet management systems has been revolutionary for their transportation. Since companies started using these advanced systems, perishable goods have become safer to consume and use without worrying about health concerns. Fleet management systems are essential for any company that transports delicate or perishable goods and wants higher quality deliveries and optimised fleet operations with less downtime and paperwork.

Ekim Saribardak, Rewire Security

Reasons for the hybrid cloud: Disaster recovery and cost

Thu, 07/04/2019 - 04:30

Hybrid cloud has evolved into a major differentiator in today’s technology landscape. Whilst security was often touted as a major challenge for businesses considering cloud adoption, the hybrid cloud model has ushered in greater agility and security compared to other cloud models. Whilst hybrid cloud has emerged as the next step in the evolution of cloud computing, what do we mean when we talk about the hybrid cloud? Is it just a varied set of IT solutions offered up by vendors looking to close more sales in a very competitive environment?

The ‘hybrid’ nature usually refers to the location of the cloud, i.e. on-premise or off-premise, whether it is private or shared to some extent, and even what type of services are being offered. So, in effect, a hybrid cloud refers to a mixture of on-premises private cloud resources combined with third-party public cloud resources, with an accompanying level of management across both. Many businesses will seek to balance their workload mix across on-premise and off-premise options to make more efficient use of IT resources and meet business objectives. For organisations running diverse applications over different legacy systems, hybrid cloud is best placed to manage their complex IT landscape. Enterprises are looking to boost productivity by hosting critical applications in private clouds and applications with fewer security concerns in public clouds, providing environments that are secure, cost-effective and scalable.

I see two main reasons for businesses to move to the hybrid cloud. One is disaster recovery. If you want to run your production environment in more than one location, using more than one provider, one option that allows you to do that is a hybrid cloud. Whether it's active or passive, you can run your environment in two locations - one in the cloud and one on premise. In this way you achieve full redundancy along with a disaster recovery plan. With hybrid cloud, companies can use the public cloud as a place to replicate their data to, whilst keeping their live data in the private cloud.

Potential compliance problems

Using the hybrid cloud to improve your disaster recovery capabilities really means that you are using cloud disaster recovery, but your live system is in a private cloud. One of the key considerations here is that the recovery process is complex at the best of times, and failover from your live site to a public cloud requires careful planning. Done well, a hybrid cloud approach is ultimately advantageous to an organisation’s disaster recovery strategy: making use of both privately managed infrastructure and the public cloud allows businesses to get the most out of both environments, and hybrid cloud disaster recovery can provide flexibility and an improved user experience.

A hybrid cloud disaster recovery approach also helps with compliance (think GDPR, for example). An organisation that hosts sensitive data in the cloud tends to be at risk of not meeting compliance standards: when a third party, such as a cloud provider, has access to your data, this can cause potential compliance problems. If you are using a hybrid cloud approach, however, you retain your own private network and you have the power to replicate and encrypt your data within your network before transferring it to a recovery site.
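
As a rough illustration of that encrypt-before-replicating step, the sketch below encrypts a file inside the private network and pushes only the ciphertext to a public cloud bucket acting as the recovery site. The bucket name, the key handling and the choice of boto3 and Fernet are assumptions made for the example, not a recommended toolchain.

```python
import boto3
from cryptography.fernet import Fernet

RECOVERY_BUCKET = "example-dr-replica"   # hypothetical recovery-site bucket

def replicate_encrypted(path: str, key: bytes) -> None:
    """Encrypt a file on-premises and ship only the ciphertext to the public cloud."""
    with open(path, "rb") as f:
        plaintext = f.read()
    ciphertext = Fernet(key).encrypt(plaintext)   # encryption happens inside the private network
    boto3.client("s3").put_object(                # only ciphertext leaves the site
        Bucket=RECOVERY_BUCKET, Key=path, Body=ciphertext
    )

# key = Fernet.generate_key()   # kept in the private network's key store, never in the public cloud
# replicate_encrypted("backups/payroll-2019-06.db", key)
```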

The second reason is cost. Sometimes a specific service produces a lot of outgoing traffic, which is relatively expensive in the cloud but costs almost nothing when served from a data centre; in that case you can split your environment into a hybrid one and move the costly portion of your setup to the data centre. If you consolidate hardware and software, virtualisation can reduce your costs significantly.

Hybrid cloud is about the mindset

Hybrid cloud also provides scalability: once you outgrow a server, you can scale your live system up into the cloud. In a hybrid cloud environment, it is possible to obtain additional efficiencies and further reduce the over-provisioning of IT resources while also maintaining the on-premise option. It is now possible to lower overall IT resource consumption and shift those loads to the lowest cost site. So, in low use periods the lowest cost option would likely be on-premise, and in periods of peak load it would be a mix of both on-premise and the lowest cost cloud provider.
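
That placement logic can be pictured with a toy example: keep workloads on-premise while capacity allows, and burst only the overflow to whichever provider is cheapest. The capacities and prices below are invented figures for illustration only.

```python
ON_PREM_CAPACITY = 400                                     # workload units the private estate can absorb
CLOUD_PRICES = {"provider_a": 0.12, "provider_b": 0.09}    # hypothetical cost per unit-hour

def place_workload(demand: int) -> dict:
    """Fill on-premise capacity first, then burst overflow to the cheapest cloud."""
    on_prem = min(demand, ON_PREM_CAPACITY)
    placement = {"on_prem": on_prem}
    overflow = demand - on_prem
    if overflow > 0:
        cheapest = min(CLOUD_PRICES, key=CLOUD_PRICES.get)
        placement[cheapest] = overflow
    return placement

print(place_workload(250))   # quiet period: everything stays on-premise
print(place_workload(700))   # peak load: the overflow bursts to the cheapest provider
```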

Indeed, when almost a thousand professionals were asked about their adoption of cloud computing for the 2018 State of Cloud Survey, the top cloud challenges facing users were found to be spend and security. The survey revealed that cloud users were aware of wasting money in the cloud and therefore rated cloud optimisation efforts as their top initiative for the coming year. Many businesses overprovision their infrastructure on-premises to be ready to handle unpredictable growth, which further adds to overspending. A managed consumption model for hybrid cloud is therefore ideal, letting businesses consume the precise amount of cloud resources they need, wherever their workloads live. And, if managed correctly, this type of model lets businesses see who is using their cloud and what the costs are.

In the continuing evolution of cloud computing, it is clear that hybrid cloud adoption is only going to increase. Gartner has already predicted that by 2020, the vast majority of businesses will have adopted a hybrid cloud infrastructure. Clearly technology is a massive consideration when you start seeing the incorporation of automation, machine learning and artificial intelligence into cloud platforms along with the way in which environments will be managed and maintained.

But is hybrid cloud all just about the technology? Of course, technology will be updated and improved, but ultimately hybrid cloud is about a mindset - one focussed on outcomes that ensure a business is delivering in terms of costs and objectives.

Yair Green, CTO, GlobalDots

Why fraud detection needs a reboot

Thu, 07/04/2019 - 04:00

Fraud occurs every day across a variety of industries, causing trillions in losses each year. While financial services and banking are among the hardest-hit industries, other frequent targets include retail, health care, information technology, government/public administration and utilities. In some segments, fraud has reached the highest levels on record, affecting more organisations than ever. The pervasiveness of the problem was revealed in a recent survey by PwC: 49 per cent of the businesses contacted for its 2018 Global Economic Crime and Fraud Survey reported they had experienced fraud and economic crime over a two-year period. But what about the other 51 per cent of organisations? Did they avoid falling victim to fraud, or did they simply not know about it? The survey noted that fraudsters hide in the shadows, exploiting organisations’ lack of visibility into their presence and activities.

Fraud is getting harder to detect

Legacy fraud management platforms have limitations that result in too many false positive alerts to investigate, a condition that enables malicious activities to go undetected. Typically, these platforms produce evidence of activity after fraud has taken place, which is a classic example of too little, too late. A major shortcoming of these platforms is that data fed into their analytics engines are siloed and lack context, which prevents IT from making an accurate assessment of risk. For example, suppose an enterprise is trying to find out if its accounts payable department is making fraudulent payments. If the company focuses exclusively on its payments data sets to detect suspicious or anomalous transactions, it will miss the opportunity to dig into the behaviour of the people authorised to make payments. By analysing behaviour, the company can determine whether an insider or hacker (who has stolen an employee’s credentials) has created a fake account or accounts to which they are sending payments.

Another shortcoming of legacy platforms is their reliance on rules to make a judgment on the legitimacy of transactions. The big problem is that rules are established manually before any activity is assessed. Consider this use case which illustrates the limitations of rule sets: A fund manager for a wealth management company exploits the rule about investing a maximum of £100,000 daily in high-risk stocks. An individual can skirt the rule and avoid detection by investing £99,000 each day. Though this isn’t necessarily a fraudulent activity, it’s still risky activity that management would want to know about. A rules-based system will not detect the activity. Another shortcoming of these platforms is they fail to correlate activities from different channels. A good example of failure occurs in banking. Transactions take place on mobile devices, the web, a credit card, a debit card, ATMs and via face-to-face interactions at local branches. A hacker can create fraudulent accounts and transactions on one system that will not be correlated with activities or behaviours on other systems because the fraud platform is unable to link data that resides in incompatible file systems and formats.
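
The fund-manager scenario can be shown in miniature. In the hypothetical sketch below, a static £100,000 rule never fires on repeated £99,000 days, while a simple behavioural check against the trader’s own history does; the figures, threshold and history are invented purely for illustration.

```python
from statistics import mean, stdev

DAILY_LIMIT = 100_000   # the static rule: flag anything over £100,000 in a day

def rule_based_alert(day_total: float) -> bool:
    """The legacy check: a fixed threshold that £99,000 trades always skirt."""
    return day_total > DAILY_LIMIT

def behavioural_alert(history: list, day_total: float, z: float = 3.0) -> bool:
    """Flag activity far outside this individual's own baseline, limit or not."""
    if len(history) < 5:
        return False                      # not enough history to baseline yet
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and (day_total - mu) / sigma > z

history = [12_000, 8_500, 15_000, 9_800, 11_200, 13_400]   # the trader's typical days
print(rule_based_alert(99_000))              # False - skirts the rule
print(behavioural_alert(history, 99_000))    # True  - far outside normal behaviour
```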

Data analytics and fraud prevention

Recent advances in a range of technologies from big data to machine learning have coalesced to build new approaches to fraud analytics. These can detect anomalous and outlying behaviours and activities in real time and provide accurate risk assessments so that mitigations can be triggered quickly. Here are several elements that are required to implement machine learning-based fraud detection at your company:

Big data store: The first thing you need is an architecture that can scale to millions, even billions of data points over time. A big data system should support large and varied data sets (both structured and unstructured) and enable your data analytics to uncover information, including hidden patterns, unknown correlations and trends.

Data sources: Your processing engine should be able to ingest data from all available sources, including online and offline, regardless of its format. More data sources will result in better correlations and insights.

Data linkage: The data must be normalised in some way so it can be linked to a specific identity. That identity could be a cashier, a customer service representative, a customer and so on. Likewise, the identity could be an entity, such as a point-of-sale device or a desktop computer. Linkage is essential to the creation of a baseline of behaviour for each identity so that new activities can be compared to the baseline to look for anomalies.

A machine learning model: Once you have a big data store, data sources and data linkage established, you need to set up artificial intelligence (AI) and machine learning models that can automatically analyse data feeds, establish baselines and risk score activity without being programmed. This process of learning uses sophisticated algorithms to look for patterns in data, adjust risk scores and make better decisions in the future based on data collected and analysed.
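
As a rough sketch of the last two elements - linking events to an identity and letting a model score behaviour - the example below groups multi-channel events per identity, derives a few simple features and scores them with an unsupervised model. The feature choices and the use of scikit-learn’s IsolationForest are assumptions for illustration, not a prescribed stack.

```python
from collections import defaultdict
from sklearn.ensemble import IsolationForest

def link_by_identity(events: list) -> dict:
    """Group raw events (web, mobile, ATM, branch...) under a single identity."""
    linked = defaultdict(list)
    for e in events:
        linked[e["identity"]].append(e)
    return linked

def features(events: list) -> list:
    """Turn one identity's events into a small feature vector for modelling."""
    amounts = [e["amount"] for e in events]
    channels = {e["channel"] for e in events}
    return [len(events), sum(amounts), max(amounts), len(channels)]

def score_identities(linked: dict) -> dict:
    """Fit an unsupervised model; lower scores indicate more anomalous behaviour."""
    ids = list(linked)
    X = [features(linked[i]) for i in ids]
    model = IsolationForest(contamination=0.05, random_state=0).fit(X)
    return dict(zip(ids, model.decision_function(X)))
```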

Criminals and hackers are already using advanced technologies, including AI, to harvest information and perform fraud at machine-level speed. To keep pace with attackers, organisations need to consider enhancing legacy rules-based fraud detection with new approaches that use data science to process multidimensional sources of information in ways humans cannot.

Saryu Nayyar, CEO, Gurucul

High performance computing increasingly benefitting UK businesses

Wed, 07/03/2019 - 07:30

IT professionals in the UK increasingly perceive high-performance computing (HPC) as beneficial to their organisations, with the vast majority considering it a ‘key technology’ for innovation.

This is according to a new report by SUSE, which claims HPC is now a driving force for positive change in the enterprise, and not just science and research.

Government organisations, academia and other industries, can all benefit greatly from HPC. It is also seen as a major ‘competitive differentiator’ for businesses, with most IT staff saying that not implementing a practical HPC application can ‘severely impact’ their competitive advantage within half a decade.  

When it comes to enterprises leading the charge with HPC uptake, Germany stands out as the country with the most large organisations on board. In the UK, 43 per cent of the IT pros surveyed confirmed their business is already using a practical application of HPC, with almost half (48 per cent) considering it as well.

Many organisations are currently in the process of educating and training their staff, to be able to implement HPC solutions.

“HPC may have its roots in academia or government institutions but a broader spectrum of organisations – from banking and healthcare to retail and utilities – are increasingly turning to HPC to deliver massive computing power. While the historic cost of HPC or “supercomputers” had limited its use to certain market segments, the evolution of both lower cost hardware and Linux has dramatically reduced the price of these systems. With compute power increasing on a scale of one thousand in just a few years, many commercial companies are now able to tap into the power of supercomputers in the form of an HPC Linux cluster – and reap the rewards,” said Matt Eckersall, Regional Director, EMEA West at SUSE.

Better security means access is not a simple yes/no question

Wed, 07/03/2019 - 07:00

It seems simple: to keep data secure, you need to make sure that the person requesting access is who they say they are, and they have the right to access the data they are requesting.

But, as with everything else, it shouldn’t be so simple—not if you want to get security right. Not all data is equal. Some data should be protected with the strongest security, while other documents are far less critical. And proving identity is also not quite so straightforward—it’s far easier to trust an employee using a company-owned device in the office than one working remotely using an unsecured device.

So as an IT service provider or managed service provider (MSP), how do you strike a balance?

One approach is to make everything highly secure and ensure that every employee requesting access proves who they are without room for doubt. But not only is this time consuming and inefficient, this is how employees end up circumventing security—posing an even bigger danger. Instead, a new approach is needed—one that assesses the risk of each request and demands the appropriate response.

The risk presented

With the lines between work and play blurring, and employees using their work devices for personal use—and vice versa—attempting to protect a business by declaring that particular devices are safe is no longer sufficient.

The level of access that is granted to each individual needs to be based on the level of confidence, or risk, they present to a business, and the level of resource access they require. So, if an employee is accessing the company network using a corporate device that is trusted, we know that that individual is secure—this person presents less risk.

But if this same person was accessing the network from a different device, say a personal one, that the network had never seen before, and from an unfamiliar place—then this person’s level of risk would go up.

The material that an individual is trying to access also needs to be considered. If the material is particularly sensitive, or is outside the regular level of access, then again, the risk increases.

When we think of risk, it’s about assessing whether the individual is who they say they are, and how likely it is that a compromised device is trying to gain access to the network.
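
To make the assessment tangible, here is a simplified, hypothetical scoring sketch: device trust, location familiarity and data sensitivity each contribute to a score, and the score decides how much friction is added to the request. The weights and thresholds are illustrative and not drawn from any particular product.

```python
def risk_score(trusted_device: bool, known_location: bool, sensitive_data: bool) -> int:
    """Combine a few simple signals into a single risk score per access request."""
    score = 0
    if not trusted_device:
        score += 40      # unmanaged or never-before-seen device
    if not known_location:
        score += 30      # unfamiliar network or geography
    if sensitive_data:
        score += 30      # request touches the "crown jewels"
    return score

def access_decision(score: int) -> str:
    """Map the score to a proportionate response."""
    if score >= 70:
        return "deny or require admin review"
    if score >= 30:
        return "step-up authentication (e.g. a one-time passcode)"
    return "grant access"

print(access_decision(risk_score(True, True, False)))    # corporate laptop, office, routine document
print(access_decision(risk_score(True, False, True)))    # trusted device, new location, sensitive data
print(access_decision(risk_score(False, False, True)))   # unknown device, unknown location, crown jewels
```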

Adding pressure

This does mean that when increased risk is present, there is some extra work for the user. Instead of granting automatic access, and potentially allowing an infected machine or unauthorised user to come onto the network, the user could be asked for additional authentication, to prove they are who they say they are.

This approach is something most people see on a day-to-day basis. When you collect a parcel or a package, although you may have an order number, you will be asked to prove your identity with photo ID or a bank card.

It’s an approach that’s widely embraced in the world of mobile banking. While minimal security is needed to look at a bank balance—usually a four or a six-digit code—if a person wants to transfer funds, then an added level of authentication is needed, to ensure protection against fraudulent behaviour.

But while adding pressure may seem like an added inconvenience, it doesn’t need to be if MSPs and IT service providers follow the 80/20 rule—treating 80 per cent of their employees in a similar fashion and treating the ‘risky’ 20 per cent with higher levels of security.

Most employees and users (the 80 per cent) will have the same needs—they will require regular access to certain materials, and restricted access to more sensitive information. The ‘risky’ characters (the 20 per cent) can also be easily identified, as they will be employees that require access to more sensitive information—such as IT administrators, HR and finance staff, and C-level executives.

Applying the 80/20 rule

With this in mind, how do MSPs and IT service providers apply the 80/20 rule, and in which scenarios is more pressure needed? How does an MSP know where their responsibilities end?

Ultimately, there will be certain users where an MSP will need to go further than it has done before, to ensure that they are fully secure. If there is a person within the organisation that can access the crown jewels, then it’s the MSP’s responsibility to ensure that anyone trying to get their hands on the jewels isn’t doing so from a dirty device or a compromised network, and that a close eye is being kept on their activity.

Let’s put this into practice. The head of HR for an organisation will be able to access data on every single employee within their organisation—and accessing this information from an untrusted, insecure device presents a huge risk. In this instance, an MSP will want to ensure that the device is controlled and that it hasn’t been compromised. It may be that security trumps convenience here, and that the user needs to use a trusted device to access the most sensitive information.

The MSP’s responsibility is to understand the most important and sensitive data about the businesses it serves: the data it holds, the data that needs protecting, the systems that are used to access this data, and the individuals that have access to it. The MSP also needs to create a division between the 80 per cent and 20 per cent staff, as well as identify the crown jewels that need special protection.

Better security, better access

With the rise of remote working, and the increase in cybersecurity threats, businesses today can’t afford to simply grant broad access to every employee in the same way. They need to use the 80/20 rule to appropriately balance risk and security.

MSPs and IT service providers have an important role to play as a trusted partner to businesses, ensuring they are keeping data secure, and giving users the access they need to be able to do their jobs. In order to do this, MSPs need to ensure they fully understand their customers’ businesses and their most precious data. They also need to put processes in place to ensure that trusted employees can access this data and apply pressure in circumstances that could be considered risky.

An MSP ultimately needs to ensure that only royalty can have access to the crown jewels.

Tim Brown, VP Security, SolarWinds MSP

US government staff told to treat Huawei as blacklisted

Wed, 07/03/2019 - 07:00

Even though US president Donald Trump recently vowed to ease the ban on sales to Huawei, the company still needs to be treated as blacklisted by US government institutions, new reports have said.

According to Reuters, a senior US official has told enforcement staff at the Commerce Department that Huawei should still be considered as blacklisted. This was done, allegedly, to clear up any confusion employees might have had following Trump’s latest move.

A few days back, Trump met with Chinese president Xi Jinping during the G20 summit in Japan, after which he promised to allow US companies to sell their products to Huawei.

Soon after, John Sonderman, Deputy Director of the Office of Export Enforcement, in the Commerce Department’s Bureau of Industry and Security (BIS), sent an email to make sure agents know how to act.

All such applications should be considered on merit and flagged with language noting that “This party is on the Entity List. Evaluate the associated license review policy under part 744,” he wrote, citing regulations that include the Entity List and the “presumption of denial” licensing policy that is applied to blacklisted companies.

Agents should also take into account any further guidance from BIS when evaluating Huawei-related license applications, he added.

Huawei was blacklisted by the US president after being deemed a threat to national security. Trump’s latest move is perceived as an olive branch, to continue trade talks with China and potentially even ease up on the ongoing trade war between the two countries.

US tech companies, who see Huawei as an important partner, have mostly praised Trump’s latest move.

Microsoft goes back to the future with Windows 1.0

Wed, 07/03/2019 - 06:30

Microsoft has raised eyebrows across social media with a new nostalgia-tinged teaser.

The company recently published a short clip on Twitter, which basically rolls back the years by showcasing how the Windows logo transformed over time – starting with the Windows 10 logo, all the way back to Windows 1.0.

While cyberpunk-styled grid visuals rolled in the background, synth music straight out of the original Tron movie was playing. The company's Instagram page was also completely wiped, leaving nothing but an ancient photograph showing boxes of Microsoft Word, Excel, Flight Simulator and other Jurassic stuff.  

People on Twitter were quick to jump onto the bandwagon, with comments like “Windows 1.0? Amazing, I can finally upgrade from my Windows 10” or “Looks like it’s time to upgrade”.

However, others have speculated that the teaser could be linked to an upcoming release, most likely an open-source version of Windows 1.0 made available online.

Windows 1.0 was Microsoft’s very first version of the operating system. It was a 16-bit platform, which allowed graphical programs to be run over MS-DOS. It was the company’s first foray into a more visual experience, leaving command lines behind them. 

IoT: How bright is the future, really?

Wed, 07/03/2019 - 06:30

Jonathan Luse, General Manager of the IoT Group at Intel, discusses the company’s Internet of Things (IoT) projects, the challenges facing the IoT industry, as well as the overall possibilities of this new and exciting technology.

What are some of the most interesting IoT projects Intel is working on at the moment?

Here are a few examples of partners using Intel technology in IoT: Agent Vi is a global video analytics software provider whose technology is used to improve security, safety and incident response time in cities using AI.

Intel’s software toolkit, OpenVINO, has helped Agent Vi scale its AI solutions across a wide range of applications including public safety and city surveillance, traffic management, waste collection and many more. Specifically, the software has allowed the company to deploy a neural network, using existing cameras, that knows when city street bins are full and alerts someone to empty them.

Another example is RESOLVE’s TrailGuard AI anti-poaching camera. RESOLVE is a non-profit organisation using cameras with Intel-powered AI technology to detect poachers entering Africa’s wildlife reserves and alert park rangers in near real-time to stop poachers.

TrailGuard AI uses Intel Movidius Vision Processing Units (VPUs) for image processing, running deep neural network algorithms for object detection and image classification inside the camera. If humans are detected among any of the motion-activated images captured by the camera, it triggers electronic alerts so the park can mobilise rangers before poachers can do harm.
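
A schematic version of that in-camera decision might look like the sketch below: detections from the on-device model are filtered for people above a confidence threshold, and only then is a small alert raised, so raw imagery never has to leave the camera. The detector output format, labels and threshold are assumptions for illustration; this is not Intel’s actual API.

```python
CONFIDENCE_THRESHOLD = 0.6   # illustrative cut-off for accepting a detection

def filter_humans(detections: list) -> list:
    """Keep only confident 'person' detections produced by the on-device model."""
    return [d for d in detections
            if d["label"] == "person" and d["confidence"] >= CONFIDENCE_THRESHOLD]

def process_frame(detections: list, send_alert) -> bool:
    """Raise an alert only when a human is present in the motion-triggered frame."""
    humans = filter_humans(detections)
    if humans:
        # Only a tiny alert payload (not the raw imagery) needs to leave the camera,
        # which keeps bandwidth demands low in remote reserves.
        send_alert({"event": "possible poacher", "people_detected": len(humans)})
    return bool(humans)

# Hypothetical output of the in-camera detector for one motion-triggered frame.
sample = [{"label": "warthog", "confidence": 0.91},
          {"label": "person", "confidence": 0.82}]
process_frame(sample, send_alert=print)
```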

The technology has been deployed in around 100 reserves in Africa including the Serengeti, with plans to expand to South East Asia and South America as well.

What steps need to be taken to ensure the world's network infrastructure is able to cope with the increasing scale of IoT?

The volume of data being generated is enormous, so the first thing we need to do is manage the bandwidth economically. We have to make sure we have the ability to tag and compress that data with relevant information and not send meaningless data up to the cloud for analytics. The first order of business is making sure that we use the network infrastructure properly, and process the right data in the right locations.
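
One way to picture that edge-side filtering is the hypothetical sketch below: a reading is forwarded only when it is anomalous or meaningfully different from the last value sent, and batches are tagged and compressed before they leave the gateway for the cloud. The thresholds and payload fields are illustrative placeholders.

```python
import json
import zlib
from typing import Optional

def worth_sending(reading: float, last_sent: Optional[float],
                  normal_range=(15.0, 30.0), min_delta=0.5) -> bool:
    """Decide at the edge whether a reading carries enough information to forward."""
    if not (normal_range[0] <= reading <= normal_range[1]):
        return True                                  # anomaly: always report
    if last_sent is None:
        return True                                  # nothing sent yet: establish a baseline
    return abs(reading - last_sent) >= min_delta     # otherwise, only meaningful changes

def package(readings: list) -> bytes:
    """Tag and compress a batch before it leaves the edge gateway for the cloud."""
    payload = {"site": "line-3", "sensor": "temperature_c", "readings": readings}
    return zlib.compress(json.dumps(payload).encode("utf-8"))

# Only two of these four readings would be forwarded upstream.
last, kept = None, []
for r in [21.0, 21.1, 21.2, 34.5]:
    if worth_sending(r, last):
        kept.append(r)
        last = r
print(kept, len(package(kept)), "bytes after compression")
```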

The second task is evolving the network infrastructure itself to take advantage of emerging technologies. Integrating technology such as 5G will give us the ability to expand network performance and bandwidth, all while keeping critical data moving in low-latency environments.

How is Intel planning on encouraging its partners and customers to integrate more IoT hardware?

The first thing we do is ensure it’s easy to activate the technologies. We’ve been investing in a lot of IoT-centric technologies in our processors, including real-time systems, time-sensitive networks with deterministic performance, manageability engines and functionally safe devices.

It’s important for us to work with our ecosystem and our partners to make sure that they can easily activate that technology and enable it for their customers. If we have all these powerful technologies inside the processor and the system, but developers have a hard time activating them, then they just don’t get deployed on a large scale. Part of the approach that we’ve taken is to produce tools and software toolkits, such as OpenVINO and the Intel Developer Zone, to make it easy to receive, activate and deploy the technology not just for a small set of large customers, but to make those technologies scale to hundreds and thousands of customers worldwide.

Customers and partners also have access to pre-created Intel Market Ready Solutions for their developers. It’s important for us to use our ecosystem to give our customers different levels of integration ready to go.


Are there any particular sectors that the IoT could especially help push forward?

Each market sector we work with presents its own challenges and whilst there are common elements, some require bespoke solutions.

The sectors we work with include visual retail and transactional retail devices, industrial systems and control systems helping in manufacturing, robotics, smart cities activities, transportation logistics, digital learning classrooms, healthcare devices, financial services and automotive.

Given the wide range of areas that we cover, it’s important for us to pull together all those different tools, RFP Ready Kits and Market Ready Solutions to give customers choice and flexibility depending on their specific application needs.  

How big can the IoT really be? Are the possibilities really endless?

The opportunity is massive, and growing fast. According to some of the market reports that we’ve seen, AI by itself is going to stimulate the economy by $13 trillion worldwide by 2030. Studies have shown that this will impact jobs in a good way, with around 58 million new jobs being created in the next five years because of these deep learning and AI technologies. It’s an exciting time for us, and the sheer amount of data being generated is creating plenty of opportunities for Intel as well as the wider ecosystem that we partner with and our customers. I’m excited about the future of IoT and Intel’s participation in the market segments.

Jonathan Luse, General Manager of the Internet of Things Group, Intel

Intel and Baidu team up on AI training

Wed, 07/03/2019 - 06:00

Intel and Baidu are teaming up to work on a new chip, designed for training deep learning models at ‘lightning speed’.

This was confirmed by the two companies during the Baidu Create AI developer conference, which was recently held in Beijing. Intel Corporate VP Naveen Rao said his company is teaming up with Baidu to work on the new Intel Nervana Neural Network Processor for Training, or NNP-T for short.

The joint effort includes both hardware and software designs with the purpose of training deep learning models at high speeds.

“The next few years will see an explosion in the complexity of AI models and the need for massive deep learning compute at scale. Intel and Baidu are focusing their decade-long collaboration on building radical new hardware, co-designed with enabling software, that will evolve with this new reality – something we call ‘AI 2.0,’” said Naveen Rao.

Intel and Baidu have been partners for years now. Since 2016, Intel has been optimising Baidu’s PaddlePaddle deep learning framework for its Xeon Scalable processors. Now, they’re optimising NNP-T for PaddlePaddle.

The two companies are also working on MesaTEE, a memory-safe function-as-a-service (FaaS) computing framework, based on the Intel Software Guard Extensions (SGX) technology.

VentureBeat believes Intel sees its future in AI. “The Santa Clara company’s AI chip segments notched $1 billion in revenue last year, and Intel expects the market opportunity to grow 30% annually from $2.5 billion in 2017 to $10 billion by 2022.”
