From artificial intelligence to increasing your workforce's digital skills, there are huge opportunities in digital transformation – but the trick is knowing how, and when, to take the plunge. Read more inside…
A concerted effort to bridge the digital skills gap is key, but an inclusive, broad-church approach is needed to make the digital project a real success
There is general consensus around a three-pronged approach to digital transformation. Identifying and implementing the right digital technology to reach radically better efficiencies is only one part of the story. Unless a business also puts the right processes and the right people in place, it will end up with a vanity project bound to fail.
At the focal point of the people aspect lies the problem of accessing and retaining the right digital talent. With the UK, and especially London, being a leading tech hub with more software developers than any of its major competitors in Europe, businesses operating here are in a uniquely favourable position in this respect. An even flow of top talent is ensured by special visas for foreigners, as well as master's-degree-level courses in subjects such as AI, cybersecurity and the blockchain that feature among the top twenty in global league tables.
But given the breakneck speed at which the tech sector is expanding – it grows six times faster than other businesses – universities can't bridge the digital skills gap by themselves.
One reason for this is that companies are in need of a full spectrum of digital skills – not just advanced ones. While postgraduates in data science, advanced digital manufacturing or IoT are instrumental in filling top positions in companies, there is also a slew of jobs requiring mid-level or lower digital skills: we’ll soon need legions of AI trainers, data assistants and digital project managers.
Higher education can only serve as one among the many digital talent pipelines supplying the missing skills that every business sector stands ready to absorb. Alternative educational institutions and schemes are, on the one hand, putting great pressure on universities to make their digital courses competitive and as relevant to industry as possible. On the other hand, the diversification of post-GCSE training in this area also means that the different digital-skills-teaching offerings don't just compete with but also complement one another. As a result, both students and employees now have access to much more diverse career paths than previously.
Programmes such as the government's recently launched Lifetime Skills Guarantee, which supplies young people without A levels with a free three-to-four-month course in further education, are designed to funnel more talented people into the digital skills pipeline. So are university conversion schemes, which enable students from non-STEM courses to pivot to a career in the digital economy, provided they have a solid background in algebra.
Graduates, on the other hand, may be signed up for a course at their employer's corporate academy, either straight after joining or at a later stage, in order to customise their academic training to a specific role. These academies, set up by large, often multinational tech companies or consultancies, have a sharp focus on a specific subject knowledge area pertinent to their profile – for example fintech, supply chains or coding. Unlike universities, they rely much more heavily on their internal experts' specialist knowledge, social or un-formalised learning between colleagues and on-the-job training.
A strand of corporate academies offers executive training too, where the C-suite has a chance to either build or tweak their digital knowledge to make better informed strategic decisions or gain the level of digital skills their role now requires. As an alternative, executives whose company doesn’t run a corporate academy can pick and choose from a wealth of targeted university modules to achieve the same.
The importance of digital skills beyond recruiting for digital roles
Given the abundance and versatility of available training opportunities, the UK still seems well-positioned to provide the digital talent needed to fill the approximately 90,000 weekly vacancies calling for different levels of digital skills. However, when it comes to the overall digital literacy of the nation, the UK ranks 30th, with 10 per cent of the population lacking basic digital skills such as browsing, emailing and word processing. Meanwhile, a much higher percentage of the population doesn't know how to use social media, a skill that will also become fundamental as millennials and Generation Z increasingly take over the labour market.
Higher education, already aware of the issue, is offering digital courses for students on arts or social sciences career paths as well. Free courses are available online for those who feel they need to familiarise themselves with digital technologies. Fittingly, the Institute of Coding – a consortium of about thirty universities, businesses and technological experts – has made it its mission not only to assist moves towards digital careers and the perfecting of specialist digital capabilities, but also to improve digital literacy in general. People tend to fear what's unknown to them. And the more segments of society learn to leverage digital tools, the fewer people will feel disenfranchised by technology.
In a more digitally savvy environment, the all-important buy-in that is central to the success of digital transformation is so much easier to get. Employees with a certain level of understanding of digital technologies are easier to win over to support digital projects and can more effortlessly adopt a “beat the bot” employee attitude – which is also the title of a £5 million West Midlands pilot project aiming to help train the workers of the region for the jobs of the future.
Pitching new digital technologies to CEOs who are knowledgeable about them will be less of an educational project than a matter of making your case, substantiated by facts and specifications. Meanwhile, people in non-digital jobs can be more open to digital solutions as customers if they can use the underlying technology confidently, thus accelerating uptake. But to get the people aspect of digital transformation right, we need all hands on deck: educators, businesses, the government and individuals alike.
By Zita Goldman
Investing in current employees is key to meeting new business needs and increasing retention
There's a spike in job seekers on the market, yet it's been increasingly difficult for employers to fill in-demand roles – many of which are tech positions.
These tech roles are critical for an organisation’s accelerated digital transformation. According to 2020 analysis from Gartner, only 16% of new hires possess the needed skills for both their current role and the future.
We’re seeing this trend mirrored within iCIMS’ Talent Cloud, the leading cloud platform for recruiting, which processes three million job postings, 75 million applications and four million hires per year. Recent interactions within Talent Cloud show that in 2020, it has taken employers an average of 68 days to fill a tech role – which is nearly 70% more than the average 40 days it has taken to fill a non-tech role.
No organisation can afford such a hiring delay in today's tumultuous environment. Hiring teams can continue to chase external, digitally skilled talent faster and faster – or they can take a new direction and look inwards at the talent pool right in front of them: their existing employees.
Here are three things business leaders can do to tap into the power of their existing workforce and accelerate digital transformation:
Engage with employees
76% of employees consider opportunities for career growth as one of their top three non-financial motivators, according to a study from ClearCompany.
To create awareness of internal advancement opportunities, open lines of communication between employees, leaders and the HR department are critical. Employees should be encouraged to freely explore internal opportunities, and organisations should foster strong internal mobility programmes to help retain top talent.

Two things are fundamental in engaging employees in their current flow of work, especially when they're decentralised: internal career portals (including internal 'gig' portals) and communication tools. When you leverage these two resources and establish a culture that recognises and celebrates employee advancement, you will increase retention and overall employee engagement and satisfaction.

Some company leaders fear that internal mobility could disrupt their business operations; however, short-term team evolutions result in higher productivity, greater employee satisfaction and lower turnover costs in the long term. With this approach, you'll get a more loyal workforce that continues to deepen its company expertise.
Recognise capabilities
In 2020, tech jobs received an average of nearly 32 applicants per job opening, while non-tech jobs received just over 23 applicants per job opening, according to iCIMS data. Multiply those numbers by how many open requisitions a recruiter is responsible for, and you're left with a lot of time spent sorting and screening candidates, most of whom aren't qualified for the role.
Whether organisations are sourcing talent internally or externally, it's clear that they need help identifying potential matches quickly. There's an opportunity for organisations to leverage artificial intelligence (AI) to sift through the entire hiring system and recommend the best external and internal candidates based on interests, as well as hard and soft skills and experience. An AI solution that offers 'explainability' behind its matches can aid recruiters and hiring managers in employee conversations about transferable skills. An AI-led automated review of viable candidates makes it easier for HR and recruiting professionals to reduce bias and focus on the interview process with top candidates, empowering them to make hires faster than ever before without the need to outsource to an agency.
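As a toy illustration of what 'explainability' can mean in this context (the scoring scheme, skills and names below are invented for this sketch, not iCIMS' actual algorithm), a matcher can rank candidates by overlap with a role's required skills and report which skills drove each score:

```python
# Toy illustration of explainable candidate-role matching.
# The scoring scheme, skills and names are invented for this sketch;
# real AI matching systems are far more sophisticated.

def match_candidates(role_skills, candidates):
    """Rank candidates by skill overlap and explain each score."""
    results = []
    for name, skills in candidates.items():
        matched = sorted(set(role_skills) & set(skills))
        missing = sorted(set(role_skills) - set(skills))
        results.append({
            "candidate": name,
            "score": round(len(matched) / len(role_skills), 2),
            "matched_skills": matched,   # the 'explainability': why the
            "missing_skills": missing,   # score is what it is
        })
    return sorted(results, key=lambda r: r["score"], reverse=True)

role = ["python", "sql", "cloud", "communication"]
pool = {
    "internal: Alice": ["python", "sql", "communication", "project management"],
    "external: Bob": ["python", "cloud"],
}
for r in match_candidates(role, pool):
    print(r["candidate"], r["score"], "matched:", r["matched_skills"])
```

Because the output names the matched and missing skills rather than producing only a score, a recruiter can discuss transferable skills with a candidate instead of having to trust an opaque number.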
Invest in potential
94% of employees say that they would stay at a company longer if it simply invested in helping them learn, according to LinkedIn’s 2019 Workforce Learning Report.
Once an organisation has identified internal candidates who are a fit for a new position, it should provide them with proper training opportunities, whether in-house or external. Internal advancement opportunities aren't limited to new roles: virtual classes, webinars, certification opportunities, mentorship programmes and stretch projects are all great ways to ensure employees are set up for success. As these employees gain exposure to other areas of the business, you not only retain your top talent, but also transfer skills, goodwill and culture across the organisation.
A strategic approach to advancing top talent means investing in the evolving needs of both the business and its people. To keep up with this ongoing evolution, organisations require one thing – a diverse, digitally skilled workforce that is willing and able to pivot and grow.
Amy Warner is Director of Talent Acquisition at iCIMS. To learn more about iCIMS' Talent Cloud and the organisation's data-driven insights, visit the iCIMS website.
In marketing, true digital transformation is an ongoing evolution of attitudes and capabilities, not achieving digital nirvana.
Thanks to the havoc of 2020, businesses don’t have a choice but to transform. According to Cisco Systems, “At least 40 per cent of all businesses will die in the next 10 years… if they don’t figure out how to change their entire company to accommodate new technologies.”
The pace of change has always been rapid in digital marketing – from algorithm updates, data privacy legislation and new tools and technology, to fierce competition and changing customer expectations, the landscape shifts often and without warning.
Most organisations have embraced some form of digitisation. But the act of true transformation is one of capability and culture. From being solidly set up for marketing on your terms, to thriving in a constant state of flux, you need to be ready for anything.
At Jellyfish, we believe true readiness must manifest on four key fronts:
1. Readying your data
Do you understand your current data flows and what they are telling you? Are you able to analyse them in real time, via smart dashboards and feeds, or do you wait for that quarterly report?
Digital transformation starts with a clear-eyed appreciation of the true marketing potential for data – whether your own, or open source – to improve your customer experiences or marketing decisions in the moment. It’s critical to be able to look at the now, in time to adapt for what’s ahead.
Data readiness is about having the knowledge at your fingertips to inform or inspire agile decision-making, rather than being constantly behind the curve.
2. Adapting your culture
Digital marketing efforts are inherently connected to the customer perspective. Each failure to link channels or tactics in a marketing ecosystem represents a new kind of opportunity cost.
Imagine each advert, social post, email, blog or product page as a separate salesperson. Even if each is fantastic at what they do, they still need to cooperate, talk to each other, and understand their roles. If they fail to do this – or are incentivised to fight over the same sales commission – they’ll drive customers away.
Yet this is exactly the world many marketing departments are set up to perpetuate – both internally (a “social” budget versus a “performance” budget) and in terms of connections to the wider business (versus more integrated strategies encompassing ecommerce or customer servicing). And they are missing many value-creation opportunities along the way.
Transformation is always, at its heart, cultural. Is your organisation ready to move from the safety of extensive, pre-launch plans to a messy world of constant in-market experimentation and measurement? A fearless willingness to test, learn and refine on the go? If so, you’re nearly ready.
3. Integrating your technology
Ask the right questions of your data, give the right incentives and training to your people, and you will soon start to test the limits of your marketing technology.
Digital transformation will ultimately require an integrated marketing technology platform. This sets a foundation for proper cross-channel performance measurement, reduces wasteful duplication of effort – and critically, scales in response to additional integrations as they become available.
4. Optimising your creativity
There are some things in marketing that have always held true. Great creativity still cuts through and can deliver better ROI than the smartest media buy.
However, in digital, your creative needs the same adaptability as your data, people and tech. The most agile, data-driven media plan is nothing if the content it serves up is a poor fit to mindset, mood or moment. Whether through shrewd planning or dynamic execution, the creative assets you deploy must be as digitally responsive as the rest of your ecosystem.
Google sees a 150 per cent improvement in performance when creativity and media have been properly integrated. Agile asset optimisation ensures your marketing story can not only keep pace with the digital ecosystem and culture you’ve put in place, it can bring out the best in it.
Rethinking readiness
The massive societal and commercial shifts of 2020 have revealed organisational fissures with unprecedented clarity. We should expect the pace to continue if not accelerate. The right way forward will vary from case to case, but it’s the adaptive skill underpinning true digital transformation that holds the key for every business. Hesitation is not an option. Change can happen quickly. Businesses must stay nimble – and ready.
If you are ready for digital transformation, contact Jellyfish.
How can your team thrive 10x better in the new hybrid workplace? Cisco offers some meaningful insights
The global pandemic has dramatically changed how we work and how we feel about going back to our offices. Working from home may have been an adjustment at the start of the lockdown, but according to Cisco’s Workforce of the Future survey (conducted with 10,000 respondents across 12 markets in Europe, the Middle East and Russia), employees want to keep hold of the many positives that have emerged in our new way of working.
Increased autonomy (63 per cent) and working well as a dispersed team (66 per cent) are two main benefits; in addition, 61 per cent want to keep hold of faster decision-making. Results show that employees see this as a watershed moment that challenges cultural norms around the workplace. They believe that the hybrid workplace is here, and it’s here to stay.
But it's clear that there is a lot to do before employers and employees feel confident about this new hybrid work environment. And one size will not fit all: 87 per cent want greater ownership of defining how and when they use office space, enabling them to blend office-based and remote working. Take a look at a snapshot of the survey results.
So, how will companies be able to create this ideal work environment? A vast majority of companies say they could improve work environments with intelligent technology. And, having introduced some very exciting Artificial Intelligence (AI) innovations to help employees be more productive from anywhere and to help companies plan for a safe return to the office, Cisco Webex seems to be leading the transformation to the hybrid workplace with the goal of creating a 10x better meeting experience.
With video conferencing now a part of daily work life for most, it’s critical that companies nail the video meeting experience. Webex is full of features to help users before, during and after meetings. It delivers seamless collaboration with built-in AI technology that overcomes common friction points and it works in sync with the apps you love.
We’ve picked a few innovations and features that truly impressed us from their recent October launch, features that we think will become the necessities of tomorrow.
Show up the way you want
Cisco has enlarged the preview so you can clearly see how you and your background will appear prior to joining a meeting and personalise your appearance; there are also plenty of zippy virtual backgrounds to choose from.
Stop searching for how to ____ (fill in the blank)
Cisco’s new, intuitive and elegant design puts controls in the most logical place without covering shared content or your video, and it even adapts to your screen size. Want to chat with someone? Hover over them. Need to change your audio? Hover over yourself. Simple.
Image credit: Cisco
Make it yours
Hate it when your screen fills with a big grey box because the person who is talking isn’t sharing video? Now you can tell Webex to hide non-video participants. This is just one of many new customisable options. Customise once, enjoy every meeting.
End the background noise
Distracting noise is a thing of the past when you use Webex meetings. By the end of this month, technology from their recent BabbleLabs acquisition will both quash the whirr of the vacuum and enhance the speaker’s voice so you can hear clearly.
Technology to help workers return to the office safely—when it's time
Whether they think a return to the office is 3, 6 or even 12 months away, companies know they must prepare now: far-reaching changes are needed. Companies need to optimise their space, address worker concerns about sanitation and social distancing, and plan how to communicate policies and information clearly. Cisco is giving customers the data and tools to make this journey back to the office, with some impressive enhancements and upgrades to their collaboration devices.
Even more environmental sensors in Webex devices
Cisco has pioneered the use of sensors in collaboration gear to provide important environmental datapoints to users, IT and employee experience experts so that they can ensure a safe and comfortable workplace. Previously they announced sensors that detect ambient noise levels and count the number of people in a space. These existing sensors—and the data they collect—become even more important now as they can identify underutilised or overcrowded spaces.
Additional sensors will also collect data on room temperature, humidity, air quality and light levels. And they've made their people-counting capabilities smarter. Thanks to machine learning, Webex will help ensure compliance with room capacity limits, counting people whether they are wearing face masks or not—no matter where they are in the room. Quickly identifying workspaces with environmental roadblocks lets companies take action to improve productivity and the quality of the meeting experience, for both in-office and remote participants.
Webex Room Navigator: new devices for inside and outside meeting rooms
Companies need a new approach to the meeting space to ensure safety and optimise use. Cisco has launched two versions of the new Webex Room Navigator. These are purpose-built devices that contain all the sensors mentioned above. They sit either inside or outside the meeting room to provide intelligent, safe room-booking for users and deep data for IT and facilities managers.
Image credit: Cisco
The out-of-room model makes it simple to find and book a space to meet: it clearly shows when the room is free and changes the status to booked when you enter. Activate it with touch or—to avoid touching a possibly contaminated surface—just use your voice. The in-room model provides alerts for social distancing, cleaning schedules and more, and allows you to book from inside the space, too.
Both versions have a digital signage mode, so companies can use the screens to convey important information. Even better: built-in "no-show" technology automatically senses if no one shows up for a scheduled meeting and frees up the room for another team’s use.
Webex Control Hub: deep, actionable workplace analytics
Cisco's global workforce survey highlighted that fewer than half of all companies could measure room utilisation. This means they lacked the visibility needed to set proper cleaning schedules—a top concern of employees as they contemplate returning to the office. Webex Control Hub provides this much-needed visibility, with historical insights into both room utilisation and room-environment metrics such as rooms booked but not used, median occupancy across all spaces, and ambient noise levels. These insights give workplace decision makers actionable information to optimise real-estate utilisation.
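A minimal sketch of how such utilisation metrics could be derived (the room names and figures are invented, and this is not Control Hub's actual data model) might combine booking logs with sensor occupancy samples:

```python
# Hypothetical room-utilisation analysis of the kind a workplace
# analytics dashboard might surface. Room names and figures are
# invented; this is not Control Hub's actual data model.
from statistics import median

# One record per room: bookings made vs. meetings actually held,
# plus occupancy counts sampled by in-room sensors.
room_logs = {
    "Boardroom": {"booked": 10, "used": 4, "occupancy_samples": [8, 10, 12, 9]},
    "Huddle-1":  {"booked": 20, "used": 19, "occupancy_samples": [2, 3, 2, 4]},
    "Huddle-2":  {"booked": 15, "used": 6, "occupancy_samples": [1, 2, 2, 1]},
}

for room, log in room_logs.items():
    no_show_rate = 1 - log["used"] / log["booked"]  # bookings that went unused
    occupancy = median(log["occupancy_samples"])    # typical head count
    print(f"{room}: {no_show_rate:.0%} of bookings unused, "
          f"median occupancy {occupancy}")
```

Even this crude summary surfaces the decisions the article describes: a boardroom with a high no-show rate is a candidate for automatic release, while consistently low median occupancy suggests the space could be repurposed.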
If we’ve learned anything over the past few months, it’s that businesses need to be as resilient as possible. From a global health crisis to bad weather, being prepared for the unexpected is what can assure the survival of a business.
The future of work is simply about replicating, as closely as possible, the experience of physically being in a traditional workplace while eliminating dependency on a single location. It's about having the ability to access and interact with everything we need: applications, data, storage (whether on-premise or in the cloud). Most importantly, it's about retaining interpersonal relationships with our colleagues, business partners and customers, and doing so in the most efficient, integrated and secure way possible. This combination will make the workforce of the future ready—and successful.
To learn about all Cisco’s latest innovations read their blog or visit the Cisco solutions website.
Companies of all sizes and industries are switching to cloud solutions. SMEs are able to make the move fairly quickly. But the process is more complex for businesses with rapidly growing volumes of work. What should be considered when migrating to the cloud?
Cloud technologies are changing the way organisations commit resources, enabling them to respond instantly to market opportunities and allocate funds at exactly the right time. But with all the advantages the cloud offers, there are also a number of risks and pitfalls that need to be considered.
Preparation is everything
As with all major projects, preparation is a vitally important component of a successful migration to the cloud. First, a project team should be put together to deal mainly with the reorganisation of the infrastructure – and with it, the migration itself. Converting the IT system by switching to a cloud solution can cause uncertainty and confusion, so transparent communication and professional change management are essential to ensure the process is not derailed.
What are your expectations of the cloud?
A simple question, which in practice is rarely answered at the beginning. The cloud offers many advantages such as scalability, flexibility and geo-redundancy. For a successful migration, determining the economic benefits and setting the corporate goals play a crucial role. For example, one company may seek greater IT agility due to fluctuating traffic, while another would like to improve the collaboration between remote teams. Each company must therefore define the reasons for a cloud migration based on the business model.
How much does it cost to migrate to the cloud?
The calculation of the so-called total cost of ownership (TCO) involves more than just comparing the initial investment in your own hardware with the recurring costs of a cloud environment.
You also have to consider the running costs of using, maintaining and keeping the respective IT solution up to date over its entire life cycle – usually between five and ten years, depending on the project.
A comprehensive TCO analysis contains three main categories:
Capital expenditure (CapEx) – the initial purchase of hardware and software, and investment in a new datacentre facility
Operating costs (OpEx) – support costs for the hardware and software, salaries of IT employees and maintenance costs
Indirect costs – downtime effects on productivity, business agility, loss of sales and opportunities, plus other factors
For each of the three areas, costs must be fully recorded so that an economic comparison with a cloud solution makes sense.
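To make the comparison concrete, here is a minimal sketch of a five-year TCO calculation. Every figure below is a hypothetical placeholder, not a benchmark:

```python
# Hypothetical five-year TCO comparison. Every figure below is an
# illustrative placeholder, not a benchmark.

LIFECYCLE_YEARS = 5

# On-premise: one-off capital expenditure (CapEx)...
capex = {
    "hardware": 120_000,          # servers, storage, networking
    "software_licences": 40_000,
    "datacentre_fitout": 30_000,
}
# ...plus recurring operating (OpEx) and indirect costs per year
opex_per_year = {
    "support_contracts": 15_000,
    "it_salaries": 60_000,
    "maintenance": 10_000,
}
indirect_per_year = {
    "downtime_productivity_loss": 8_000,
    "lost_opportunities": 5_000,
}

on_premise_tco = sum(capex.values()) + LIFECYCLE_YEARS * (
    sum(opex_per_year.values()) + sum(indirect_per_year.values())
)

# Cloud: one-off migration effort plus a recurring subscription
migration_one_off = 50_000
cloud_monthly_fee = 7_500
cloud_tco = migration_one_off + cloud_monthly_fee * 12 * LIFECYCLE_YEARS

print(f"On-premise TCO over {LIFECYCLE_YEARS} years: £{on_premise_tco:,}")
print(f"Cloud TCO over {LIFECYCLE_YEARS} years: £{cloud_tco:,}")
```

In practice the indirect-cost category is usually the hardest to estimate, so it is worth stress-testing those assumptions in particular before drawing conclusions from such a comparison.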
Cloud providers often have very complex billing models, with many services and additional features charged on top of the monthly rate. To stay on track, a clear definition of the requirements for the new cloud landscape helps you compare tariffs from different providers effectively.
Important tip: an exit strategy should always be taken into account. If data is to be transferred from the cloud back to your own servers at a later point, this is usually associated with high costs – the so-called vendor lock-in effect. It's easy to get into the cloud, but not so easy to get out.
Which type of migration should I choose?
Firstly, no matter which approach you choose, data should be cleansed beforehand to avoid paying to store data you don't need in the cloud. There should also be an audit of the application landscape – ask yourself which apps are cloud-ready.
Afterwards, the decision can be made between three approaches, or a combination of several:
Rehosting
With this approach, also called “lift-and-shift,” applications are transferred unchanged without having to edit any code. Rehosting is possible in several ways: as hot migration, cold migration or as mass migration.
Replatforming
Replatforming adapts workloads to the new platform and optimises them for increased performance and scalability. However, the basic architecture of the application remains unchanged.
Refactoring
The most time-consuming and resource-intensive migration model is refactoring, where migrated workloads are completely revised and rewritten. This enables applications to use cloud-native frameworks and functions.
Conclusion
Migrating to the cloud is complex due to the variety of infrastructure landscapes, applications and uses; it's always a highly customised process. But with the right preparation and technical advice from IT experts, it can be an exciting challenge that brings many advantages. IT costs can be optimised by turning large one-off expenses into predictable recurring operating costs. Collaboration improves as team members can work on shared workflows in real time, wherever they are, and always see the latest developments. Redundant processes are largely avoided and productivity increases.
OVHcloud’s preconfigured solutions are designed to accelerate your journey to the cloud, maximising cost control and increasing agility. In addition to offering scalable and flexible solutions, our enterprise customers also receive enhanced support, on-boarding and customisation. For more information on our hosted private cloud and public cloud services, please visit https://www.ovhcloud.com/en-gb/
Hiren Parekh, VP Northern Europe, OVHcloud
Jeremy Swinfen Green explores the ethical issues in emerging technology and proposes a simple framework that will help developers meet the expectations of wider society.
Emerging technologies such as AI present enormous opportunities for humanity. But they may also come with substantial risks, some of which are as yet hard to detect. Agreeing an ethical framework to guide the development of applications in an area of emerging technology seems sensible: such a framework won't eliminate risks, but it could well reduce their likelihood and potential impact.
Ethics vs compliance
An ethical framework is a set of principles that can provide a solid base for applications that are consistent with the accepted social norms and moral principles in the society they are developed in. In the UK at least, these include honesty, fairness and human rights. In other words, ethical frameworks are about “doing good”, or perhaps more accurately “not doing harm”.
In business, ethics involves how an organisation and its employees conduct themselves. It's not necessarily the same as compliance (with regulations or standards): you can be compliant and still be unethical, for instance. And in an area of emerging technology there may well not be regulations and standards available to comply with. At this stage, all an organisation has to guide it may be an ethical framework.
Ethics for technology
If an ethical framework is to be useful in an area of emerging technology, it needs to be accepted prior to any business activity that uses the technology. It’s needed when the initial business case for a new product or service that uses an emerging technology is being developed. It’s needed during the development phases. And it’s needed when the final product or service is rolled out, or when it is bought by a third party. It’s not something that can be bolted on as an afterthought.
You might argue that any product or service in an emerging area of technology needs to be "ethical by design". Facebook might look very different today had it taken this approach, for instance. Admittedly, it would probably be rather smaller, but profit is generally not an excuse for unethical behaviour. And Cambridge Analytica would probably still be operating successfully.
An example framework: artificial intelligence
Using existing ethical frameworks for professions like the law and medicine, it’s possible to propose a framework for the development and implementation of artificial intelligence (AI) technology. This might contain nine fundamental principles:
Fairness. Any AI application needs to be fair to people who use it or who are impacted by it. It must be free of bias or discrimination, whether intended or not. This means, for instance, that any data used to "train" it must be representative of the population the system will serve, not an incomplete data set that would produce skewed results.
Accountability. The developers of an AI system should not be able to blame the system they have developed if things go wrong. A legal entity – whether an organisation or an individual – must have clear accountability.
Transparency. It must be possible, and ideally simple, to understand how the AI system operates – for instance, how it makes decisions. Because AI systems can “learn”, the way they make decisions could very quickly become obscure unless some form of decision audit trail is maintained, so that it becomes possible to track the way previous decisions have impacted on current decisions.
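One way to maintain such a decision audit trail — sketched here as an assumption, not any specific product — is an append-only log in which each entry is hash-chained to the previous one, so that tampering with past decisions is detectable:

```python
import json
import hashlib
from datetime import datetime, timezone

class DecisionAuditTrail:
    """Append-only log of model decisions, hash-chained so that
    altering a past entry breaks the chain (illustrative sketch)."""

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # sentinel hash for the first entry

    def record(self, inputs, decision, model_version):
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "inputs": inputs,
            "decision": decision,
            "model_version": model_version,
            "prev_hash": self._last_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self._last_hash = entry["hash"]
        self.entries.append(entry)
        return entry

trail = DecisionAuditTrail()
trail.record({"income": 42000, "age": 31}, "loan_denied", "v1.3")
trail.record({"income": 91000, "age": 55}, "loan_approved", "v1.3")
print(len(trail.entries))  # 2
```

Recording the model version alongside the inputs is what makes it possible to track how previous decisions (and previous versions of the system) have shaped current ones.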
Security. AI systems must be secure against tampering, especially if they are being used on tasks that may affect individual people’s lives. This means (using cyber-security’s “CIA” model) that they must be:
• Confidential. Their outputs and any data they hold must be available only to authorised people
• Integrity-protected. Their algorithms and the data they hold must be safe from interference and alteration by unauthorised people
• Accessible. They must be secure from attempts to prevent them from undertaking their legitimate activities, and the data and outputs they provide must be available to authorised users
Agility. The world of AI moves fast. Any system needs feedback loops so that the way it operates can be improved over time or changed to take account of changed circumstances. This is especially true of an AI system which uses data from the past to project actions into the future: the past does not always indicate what the future will look like. Systems should not be built with the assumption that they will always operate in the same way.
Diligence. During development and implementation anyone associated with building or managing the system should take due care to ensure that all relevant factors that could cause harm to others are taken into account. For instance, potential threats to people’s privacy or physical safety should be identified and mitigated. Of course, not all threats will be identifiable. In such a case a failure to predict and mitigate a threat won’t be an ethical failure – although it may be a business risk. But people who own AI systems should take care that they are not negligent in terms of how they look for and map out potential harms arising from the system.
Autonomy. AI systems should not prevent people from taking their own decisions about their lives: they should not reduce the “agency” that people have. This isn’t to say that an AI system shouldn’t deny someone a bank loan for example. But it should not be able to prevent people from going about their legitimate business or making their own choices of how to behave.
Safety. AI systems should always be designed so that they avoid causing physical or mental harm to users or people impacted by them. Any emotional harm should be avoided, unless it’s an inevitable result of a lawful decision made by the AI system (such as refusal of a bank loan), in which case it should be minimised as far as possible.
Privacy. AI systems should be developed and operated in line with the principles surrounding the protection of personal data. Note that it is important to accept that the harms AI can cause go far beyond privacy breaches, and that privacy, while important, should not be used as a proxy for the harms AI systems can create.
If organisations engaging with AI consider each of these principles, then the potential for harm from AI will be reduced substantially. This should be true whether they are planning to develop new AI applications or use those developed by third parties.
The upcoming period is crucial for the future of many organizations. Listen to the experiences, wishes and ideas of your employees now to win later.
Until recently, organizations were primarily focused on attracting the right talent, a process made more difficult by unprecedented growth and significant labor shortages. Now we are living in an era where mass redundancies loom large and organizations are taking drastic cost-cutting measures.
Companies are mainly focused on survival. They need to continue to serve their customers with reduced resources, while their employees are losing faith in the future of their organization. And we probably haven’t seen the worst of it yet. Who can even start to think about the future in this kind of situation?
Lessons in management from previous crisis situations
In this type of scenario, there’s a huge temptation to cut your coat according to your cloth, freeze budgets, halt onboarding programs and start reorganizing — or simply bury your head in the sand. However, previous crisis scenarios have demonstrated that businesses that panic and rush into wholesale redundancies often need years to recover from the lack of trust generated as a result. Employees do not easily forget how the organization behaved during a crisis.
Come out of the recession 10% better off than your competitors
The last thing you want to do right now is make rash decisions. You will gain much more by thinking in a counter-cyclical manner and dedicating yourself to a long-term strategy. This long-term strategy must take into account the long road that lies ahead, during which you will need the full support of your employees.
The strategy must fully draw upon the reason why your organization exists if it is to help maintain everyone’s trust in you. And it must drive your company to go that extra step. If you can come out of the recession 10% better off than your competitors, you will be able to increase your market share and continue to reap the benefits for years to come.
Maximum alignment
Looking ahead to when the current crisis has passed will have a powerful motivating effect on your employees during the here and now. Your employees will know that the organization really stands for something and that, through their work, they’re making a contribution to society. They will also be less concerned about the future because they will be busy innovating. Ultimately, alignment is key. It is important to make sure that everyone is pouring their energy into the same thing and that all your employees are working together to defeat the same enemy: the current crisis.
Sixty percent of employees are at risk of burnout
However, many organizations are not currently exploiting the full potential of their employees. They need their employees more than ever, but they face a number of problems. For example, the beginning of the COVID-19 crisis was particularly challenging for many employees. If you suddenly have to work and relax in the same space, it can be difficult to find a good balance between the two. Effectory Pulse surveys of 123,000 employees across Europe have revealed that a mere 40% of employees have managed to maintain a good work-life balance in recent months. The remaining 60% are more likely to experience burnout.
The same data analysis has revealed that 33% of employees are not able to work effectively. This is a result of their living situation, lacking the right resources, or difficulty collaborating with colleagues.
The business case for engagement
In order to be successful and stay successful, your organization must be able to adapt to specific, dynamic situations, such as COVID-19. This can only happen if your employees work with you, if they feel engaged. Did you know that employees who feel this sense of engagement are 30% more adaptable than employees who do not?
In short: you cannot create an agile organization if your employees do not feel engaged.
Listening, learning and taking action
How do you create this kind of engagement? By using “Continuous Employee Listening”. This doesn’t mean that you have to get your employees to fill out questionnaires all of the time. It means that you have the right resources to get valuable feedback at any time, for example, by using an onboarding survey, a team survey, or an exit questionnaire.
But listening alone is not enough. You also need to learn from this feedback and take action. Really listen to what people are saying; then stop and do something about it. Change, improve and create a sense of engagement.
Ask the right questions
The right listening strategy is all about asking the right questions. What is the quickest way to get the best out of your new employees? What can encourage better performance from your current employees? And how can you make sure that employees who are performing well will continue to work for you for a long time? If you ask the right questions, really listen and take the right action, your employees will feel more engaged, work harder for you, and stay with you longer.
Create a listening environment
All of this sounds great, but how do you go about creating a listening environment? You need to implement three elements on three levels.
The three elements are:
The three levels are:
Aim for more success
Once you start really listening to your employees, your organization will begin to reap the benefits. There are many benefits for organizations that use Continuous Employee Listening to respond quickly to what’s happening around them:
Want to find out more about really listening to your employees?
The coming months will be crucial to the viability of your organization in the future. Listening to your employees’ experiences, wishes and ideas now will help you to keep going through the coming months, affirm your organization’s reason for being, and continue to strengthen your long-term position. At Effectory we offer employee listening solutions for the entire employee experience.
Find out more about Continuous Employee Listening.
By Guido Heezen, Founder, Effectory
Harun Šiljak, Trinity College Dublin
Google reported a remarkable breakthrough towards the end of 2019. The company claimed to have achieved something called quantum supremacy, using a new type of “quantum” computer to perform a benchmark test in 200 seconds. This was in stark contrast to the 10,000 years that would supposedly have been needed by a state-of-the-art conventional supercomputer to complete the same test.
Despite IBM’s claim that its supercomputer, with a little optimisation, could solve the task in a matter of days, Google’s announcement made it clear that we are entering a new era of incredible computational power.
Yet with much less fanfare, there has also been rapid progress in the development of quantum communication networks, and a master network to unite them all called the quantum internet. Just as the internet as we know it followed the development of computers, we can expect the quantum computer to be accompanied by the safer, better synchronised quantum internet.
Like quantum computing, quantum communication records information in what are known as qubits, similar to the way digital systems use bits and bytes. Whereas a bit can only take the value of zero or one, a qubit can also use the principles of quantum physics to take the value of zero and one at the same time. This is what allows quantum computers to perform certain computations very quickly. Instead of solving several variants of a problem one by one, the quantum computer can handle them all at the same time.
These qubits are central to the quantum internet because of a property called entanglement. If two entangled qubits are geographically separated (for instance, one qubit in Dublin and the other in New York), measurements of both would yield the same result. This would enable the ultimate in secret communications, a shared knowledge between two parties that cannot be discovered by a third. The resulting ability to code and decode messages would be one of the most powerful features of the quantum internet.
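In Dirac notation (the standard quantum-mechanics shorthand, added here for illustration rather than taken from the source), the superposition and entanglement described above can be written as:

```latex
% A qubit can be in a superposition of the basis states |0> and |1>:
\[ |\psi\rangle = \alpha|0\rangle + \beta|1\rangle,
   \qquad |\alpha|^2 + |\beta|^2 = 1 \]

% An entangled pair (a Bell state) shared between Dublin (D) and New York (N):
\[ |\Phi^{+}\rangle = \frac{1}{\sqrt{2}}
   \left( |0\rangle_{D}|0\rangle_{N} + |1\rangle_{D}|1\rangle_{N} \right) \]
```

Measuring either qubit of the Bell state yields 0 or 1 at random, but the two measurements always agree — which is why the outcomes can serve as shared secret material that no third party can reproduce.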
Commercial applications
There will be no shortage of commercial applications for these advanced cryptographic mechanisms. The world of finance, in particular, looks set to benefit as the quantum internet will lead to enhanced privacy for online transactions and stronger proof of the funds used in the transaction.
Recently, at the CONNECT Centre in Trinity College Dublin, we successfully implemented an algorithm that could achieve this level of security. That this took place during a hackathon – a sort of competition for computer programmers – shows that even enthusiasts without detailed knowledge of quantum physics can create some of the building blocks that will be needed for the quantum internet. This technology won’t be confined to specialist university departments, just as the original internet soon outgrew its origins as a way to connect academics around the world.
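The article does not name the algorithm the team implemented, but as an illustration of how accessible these building blocks are, the key-sifting step of the well-known BB84 quantum key distribution protocol can be simulated classically in a few lines:

```python
import random

def bb84_sift(n_bits, seed=None):
    """Classical simulation of BB84 key sifting (illustrative only:
    real security comes from quantum physics, not from this code).
    Alice sends random bits in random bases; Bob measures in random
    bases; they keep only the positions where the bases matched."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]
    bob_bases   = [rng.choice("+x") for _ in range(n_bits)]
    # When bases match, Bob reads Alice's bit correctly; otherwise
    # his result is random and the position is discarded.
    sifted = [bit for bit, a_base, b_base
              in zip(alice_bits, alice_bases, bob_bases)
              if a_base == b_base]
    return sifted

key = bb84_sift(16, seed=42)
print(key)  # roughly half the positions survive sifting
```

In the real protocol the discarded positions are revealed over a public channel, and an eavesdropper measuring in the wrong basis introduces detectable errors — exactly the shared-knowledge property described above.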
But how could this quantum internet be built anytime soon when we currently can only build very limited quantum computers? Well, the devices in the quantum internet don’t have to be completely quantum in nature, and the network won’t require massive quantum machines to handle the communication protocols.
One qubit here and there is all a quantum communication network needs to function. Instead of replacing the current infrastructure of optical fibres, data centres and base stations, the quantum internet will build on top of and make maximum use of the existing, classical internet.
With such rapid progress being made, quantum internet technology is set to shape the business plans of telecom companies in the near future. Financial institutions are already using quantum communication networks to make inter-bank transactions safer. And quantum communication satellites are up and running as the first step to extending these networks to a global scale.
The pipes of the quantum internet are effectively being laid as you read this. When a big quantum computer is finally built, it can be plugged into this network and accessed on the cloud, with all the privacy guarantees of quantum cryptography.
What will the ordinary user notice when the enhanced cryptography of the quantum internet becomes available? Very little, in all likelihood. Cryptography is like waste management: if everything works well, the customer doesn’t even notice.
In the constant race of the codemakers and codebreakers, the quantum internet won’t just prevent the codebreakers taking the lead. It will move the race track into another world altogether, with a significant head start for the codemakers. With data becoming the currency of our times, the quantum internet will provide stronger security for a new valuable commodity.
Harun Šiljak, Postdoctoral Research Fellow in Complex Systems Science for Telecommunications, Trinity College Dublin
This article is republished from The Conversation under a Creative Commons license. Read the original article.
The term “digital economy” is as easy to quantify as South American exports. There is more data being collected than can actually be stored. It has become so widespread that we can’t determine the impact it has made on each of the traditional sectors. With many moving towards edge computing, more data is being stored on the “edge” of networks, as opposed to on a central server. This inevitably means we have more data. More data enables a broader oversight of our economy as a whole. The key is how we make the most efficient use of this data to maximise productivity.
Economics is the study of, among other things, how resources are best allocated for production, distribution and consumption, from the consumer level to whole economies. It is upon this understanding that we have built various economic systems, such as mixed or market economies, and these have been adopted and developed in numerous ways by different countries depending on leadership preference. However, when it comes to the digital side of the economy, all types of economic system can, have and will adopt new technologies, as they enable them to climb up the value-chain ladder.
Most adults in developed countries have access to at least one card-based form of digital payment. The technology for digital payment was shaped long before it was accepted, yet most adults now carry little cash, if any, given the convenience of contactless payments. While the trust of these mediums at a consumer level took time, it took even longer to convince banks that this payment method was an improvement on hard cash.
As the “manufacturer” of the new payment form, the Visa Association had to convince the banks that this was the correct direction to take. The initial challenges were relentless – once Visa had managed to convince the banks that this could vastly improve their profits, the means to adopt were justified. However, once the card had made it into households, the next challenge was that the nation’s infrastructure had not caught up – not every petrol station had an ATM, nor every coffee shop a card machine. It was then that the banks, in turn, had to convince the government to increase investment in UK infrastructure in order to normalise this payment method.
A parallel can be drawn with other, newer digital technologies and the economy. Within all sectors of the economy, those “manufacturers” of digital technologies need to persuade those within different industries to adopt them by demonstrating the return on investment achievable. Indeed, it is not straightforward to illustrate a return on investment when data is the subject. However, a little funding from the right source to develop case studies where companies are able to visualise the ROI will surely enable more and more companies to begin to implement these technologies.
The ability to accrue large amounts of data and analyse it to assist processes is becoming a necessity for businesses to stay aligned with the competition. This doesn’t apply to any one sector or industry in particular; it is a global shift. Moving towards a more digital and also a more circular economy means those resources are far better used.
Within every industry we are able to see how artificial intelligence and the internet of things are forming an inclusiveness in sharing information and ideas among countries and corporations alike. The sharing of data is much more manageable and accessible than it has ever been. We must exploit this from an early stage and use it to capitalise on a competitive advantage.
The newest form of digital currency – cryptocurrency – has had a very slow uptake in the marketplace, thanks to uncertainty and its decentralised nature. But once reservations are overcome, could it eventually become the main form of currency in years to come? Perhaps yes, as there is no need for intermediaries or a central bank. Society is understandably cautious when faced with new and different ways of doing things, but if we look back on the past decade, we didn’t initially support half the technologies now used daily. Whether it was due to fear, or the technology itself being in its infancy, we cannot be certain.
But what we can be certain of is the opportunity current developments will present over the next decade. Newspapers and television used to be where we went for information, but now we are able to gather a quicker summary of any global situation online. The world is evolving; instead of getting left behind, let’s learn with it.
Unsure about how the digital economy will develop? Or perhaps you want to be put in touch with experts in digital adoption? We can help at www.gambica.org.uk
Follow Nikesh Mistry on Twitter and LinkedIn here.
Image provided by GAMBICA
By Nikesh Mistry, Sector Head, Industrial Automation, Gambica
As telemedicine continues to evolve at a rapid pace, it has become clear that providers and vendors must change their cyber-security practices to keep pace and properly protect patients’ privacy and security. Any failure to do so may lead to dire consequences.
The rise in cyber-threats in the medical industry
There is evidence of increasing cyber-threats in the market. The Covid-19 pandemic has forced many service providers to move their services into the virtual world. This digital migration has created a host of new challenges for telemedicine service vendors, care givers, clinics and patients.
The transition from traditional to connected healthcare is challenging. The number of observed cyber-attacks on IoT devices rose by 300 per cent in 2019 alone. It’s estimated that 50 billion medical devices will be connected to clinical systems over the next decade, which demonstrates the scale of the opportunity for hackers. When a cyber-attack is successful, patients and healthcare providers become vulnerable as sensitive health data is breached.
There have been two terrifying recent ransomware attacks, in the US and Germany, that show how vulnerable systems can be, and the potential consequences of a worst-case scenario. Unfortunately, these attacks are just two alarming examples of the current state of telemedicine cyber-security and reinforce why it is so important for companies to remain vigilant during these times.
Cyber-threats: inside the numbers
There are alarming trends: From 2018 to 2019, health record breaches rose from approximately 14 million to more than 41 million. Breaches of the HIPAA rules enforced by the Department of Health and Human Services’ Office for Civil Rights (OCR) have severe consequences: the average financial penalty for a breach in 2019 was close to £1 million. But the hefty fines don’t seem to be motivation enough for the improvements in the health care market that are now desperately needed.
Stealing health records is a lucrative business for cyber-criminals: they can be sold on the dark web for close to £1,000 – 200 times the black market value of financial records. This makes health records the most valuable type of records being traded by criminals, since they provide a comprehensive and complete picture of a person’s health background and identity. Cyber-criminals can then harvest this information and sell it to forgers, human traffickers, terror organisations, hostile countries, drug cartels and other criminals.
Facing the challenges
We think about the challenges that our customers face every day and we want to help. At Irdeto, we offer modular cyber-security solutions and services to a wide variety of industries, including medical technology and telehealth companies. These tailored solutions protect software and medical devices from cyber-attacks, meet regulatory requirements and protect patient safety.
But at Irdeto, we don’t just serve big companies. We know that cyber-security in all industries, especially healthcare, is everyone’s responsibility. So we aim to level the playing field for smaller med-tech companies by enabling and empowering them with the ability to use our portfolio of market-leading cyber-security solutions and our extensive patent library of security technologies. This enables them to protect their solutions while providing optimal security that keeps patients safe.
You need credible cyber-security to succeed
Due to constant threats and industry challenges, it should be mandatory for top professionals to practice effective cyber-security management. Stakeholders need to re-assess how they tackle cyber-risk and implement effective cyber-security measures within their respective areas. Protecting software running on medical devices should now be the top priority for all connected device makers and telehealth providers.
Software applications are becoming an increasingly significant part of the attack surface, and unprotected software applications can leave a trail of breadcrumbs that can be reverse-engineered to disrupt a platform that delivers vital care.
As healthcare continues to evolve, who can tell what the next vulnerable entry point for hackers will be? Successful providers in the coming years will be those who can employ cyber-security strategies that protect patient data and stop hackers from breaching the network. Businesses simply cannot move forward and innovate in the telemedicine space until there is ubiquitous cyber-security that protects the most crucial data from those who badly want to access and corrupt it.
Curious about your cyber-security options and want to learn more about our offering? Please visit us at https://irdeto.com/
By Steeve Huin, Chief Marketing Officer, responsible for Business Development, Marketing and Strategic Partnerships at Irdeto
Maureen Meadows, Coventry University
Is the UK government planning to revive identity cards for the internet age? The decision to scrap its national ID cards and database in 2010 means the UK is one of the few developed countries not to have such an identity scheme. While this was seen as a victory for civil liberties campaigners, some now argue that the lack of a simple way to prove who you are, especially online, is holding back the digital economy and improvements to public services.
With this in mind, the government recently announced plans to pave the way for a new digital identity scheme, which some media outlets have called digital ID cards.
In reality, there’s no single agreed definition of what a digital ID is or looks like, so saying the new system will be similar to the unpopular card scheme is misleading. However, the UK government is a long way from demonstrating that it could operate an ID system that follows the principles of privacy, transparency and good governance it claims to support and that are needed to protect people’s rights.
The government’s main argument for a digital ID is the supposedly growing need to prove who you are. For example, anyone buying or selling a home in the UK has to prove their identity multiple times with multiple pieces of evidence. This is time consuming, repetitive and expensive, often requiring face-to-face verification or sending sensitive documents in the post.
A digital identity should help to simplify the process, reducing the friction and costs associated with a stressful series of transactions. It could make it easier to register with a GP, or prove your age if you don’t have a driving licence or passport. And, the government argues, a digital ID could play an important role in preventing identity fraud – a serious and growing problem.
E-government
Other countries appear to have had success with digital identity programmes. Estonia has a mandatory scheme that includes an ID card but can also be used as definitive proof of identity online. It’s used for travelling, national insurance, checking medical records, submitting tax claims, accessing bank accounts, ordering prescriptions and even online voting.
And the scheme appears to have benefited the country, as part of its general mass digitalisation. In fact, Estonia has been called the most advanced digital society in the world. It has one of the world’s best rates of tax collection, supported by e-taxation. Participation in elections has increased, alongside the introduction of online voting. Around 99% of public services are now online, available 24/7. Its healthcare system is highly cost effective, supported by significant investment in digital records.
Plans for a digital identity would most likely be part of the government’s wider attempt to improve the collection and use of data to inform policymaking and implementation. A digital identity scheme, with a unique identifier for each citizen, could help the government to join up a variety of personal information currently held in separate department databases. This could lead to new insights on citizen behaviour and improved government decision-making.
So what could possibly be the problem with such a supposedly advantageous system? One of the risks is that a poorly implemented digital ID scheme could make it harder for some people to access services, particularly those with limited access to the internet or skills in using it. Some charities have already noted in a government consultation that a significant amount of their time is dedicated to supporting vulnerable users to navigate government online services.
Another risk is that people may feel that a “joining up” of data across government will damage their privacy. Even if we have (willingly or unwillingly) shared our data with government already, we may be relying on the notion that most officials couldn’t easily pull up (and potentially abuse) all our information in one place. The loss of such protection could further undermine trust in those who have access to our data, from the government itself to our local GPs.
But if Estonia can make it work, why can’t the UK? One of the reasons for Estonia’s successful digitalisation is that it was in many ways starting from scratch, and able to design its digital ID as part of a new wider system. The UK, on the other hand, has numerous separate existing digital systems that would need to be integrated.
Creating more problems
This problem also has implications for the UK government’s plans for more data-focused policymaking. As the Institute for Government put it, “A No10 data science unit could create more problems than it solves.” The thinktank noted that much of the data collected, stored and processed by government departments is of poor quality and subject to significant gaps, difficult to find and share, and locked away in legacy IT systems.
Building a well-rounded picture of government and society, and empowering the rest of Whitehall to use data science, will require an overhaul of data use that goes way beyond the abilities of a small team in Downing Street. The UK needs a long-term plan backed up with practical steps, a much greater willingness to invest in skills and systems, and clear high-level leadership.
Put simply, the government needs to learn to walk before it tries to run with a complex and highly sensitive digital identity scheme. It has highlighted six principles that it wants to guide the project (privacy, transparency, inclusivity, interoperability, proportionality and good governance). But these are very broad and there’s no indication yet of how they will be followed.
A UK digital identity will only work if it allows people to stay in control of their data, who it is shared with and what they are allowed (and not allowed) to do with it. Without this, we can expect to see a revival of the campaign that helped kill ID cards the first time around.
Maureen Meadows, Professor of Strategic Management, Coventry University
This article is republished from The Conversation under a Creative Commons license. Read the original article.
The rise of globally dispersed workforces and new work-from-home requirements are placing extraordinary pressure on every organisation’s security. And wherever there is upheaval, cyber-criminals thrive. Alongside the devastating health and economic impacts of the coronavirus pandemic, we have also seen a huge escalation in cyber-attacks, especially in ransomware, phishing and spear phishing, all of which have increased exponentially. As a result, here in the UK, the average cost of a data breach has grown to nearly £2.7 million, according to IBM research. Additionally, reputational harm can of course be incalculable, which is why it is so important to ensure that data is appropriately handled, classified and stored.
For example, rising email volumes as a result of remote working and the digitisation of manual processes are further escalating this risk, which is particularly pertinent as parts of the country face further lockdown measures in the coming days and weeks. Users are operating away from the normal office “look and feel”, and it is easy for them to get distracted by events which would not occur in an office environment; mistakes inevitably happen.
This is where employees play a vital role in ensuring the organisation maintains a strong data privacy posture. For this to be effective, organisations need to ensure that they provide regular security awareness training to protect sensitive information. That said, awareness – both among businesses themselves and their employees – that connected devices are not always secure is relatively low, and businesses need to tackle this head-on. In terms of how they go about doing this, they must invest in end-user training and education programmes. Users are an organisation’s most important resource, and they must be trained so they become a security asset rather than a liability; a critical part of an organisation’s security posture, not excluded due to the associated risks.
A firm must foster an inclusive security culture and ensure users are continually trained so that their approach to security becomes part of their everyday working practice – it should be embedded into all their actions and the ethos of the business. This also puts the onus back on organisations to invest in technologies that help stop the inadvertent and accidental misuse of data.
One way to do this is through data classification tools, which not only help organisations protect their data by attaching appropriate security labels to the files themselves, but also educate users to understand how to treat data with different levels of classification and sensitivity. Data is classified according to its sensitivity and importance to the organisation, enabling businesses to prevent sensitive data from leaving the business, either inadvertently or maliciously.
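As a rough illustration of how classification rules might map content to a label and feed a data-loss-prevention check, here is a minimal Python sketch. The labels, patterns and blocking threshold are purely hypothetical, not the logic of any actual product:

```python
import re

# Hypothetical sensitivity labels, ordered least to most restrictive.
LABELS = ["Public", "Internal", "Confidential", "Restricted"]

# Illustrative rules: each pattern raises content to at least the given label.
RULES = [
    (re.compile(r"\b\d{16}\b"), "Restricted"),               # card-like numbers
    (re.compile(r"[\w.]+@[\w.]+", re.I), "Confidential"),    # email addresses
    (re.compile(r"\b(salary|payroll)\b", re.I), "Confidential"),
]

def classify(text: str) -> str:
    """Return the most restrictive label triggered by any matching rule."""
    level = 0  # default: Public
    for pattern, label in RULES:
        if pattern.search(text):
            level = max(level, LABELS.index(label))
    return LABELS[level]

def should_block_external_send(text: str) -> bool:
    """A DLP-style check: anything above 'Internal' must not leave the business."""
    return LABELS.index(classify(text)) > LABELS.index("Internal")
```

In practice, real tools write the resulting label into file metadata so that downstream security products can read and enforce it; this sketch only shows the classification step itself.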
Today, data classification offers an increasingly persuasive answer to help prevent unintended data leakage, and enables organisations to maintain compliance with regulations such as GDPR, HIPAA, CCPA and more. Furthermore, data classification helps to extend the value and effectiveness of wider data security and governance ecosystems – adding new levels of intelligence to data loss prevention and data archiving solutions.
Likewise, attacks on large corporates often begin via their smaller suppliers who may be less well defended. So as businesses expand their ecosystems and work with more suppliers, often blurring the boundaries of the network, it is critical to protect not just users and their devices, but also to protect suppliers and the supply chain ecosystem.
Here at Boldon James, we view data security as an ecosystem where data classification is one of the core tenets and the foundation of any good data security plan. Our data classification solutions enable the automation of many data management tasks to enhance the performance of third-party cyber-security solutions that read the metadata applied during the classification process. This metadata can determine how a piece of data should be treated, handled, stored and disposed of – over its entire lifetime. This ultimately protects the data from any inadvertent or accidental mishandling.
Remote working looks set to be the modus operandi for the foreseeable future, so making sure that you wrap tools around the users and your whole business ecosystem will help to ensure that you don’t become one of the data breach statistics.
To find out more, visit www.boldonjames.com
By Adam Strange, Data Classification Specialist, Boldon James
2020 has been a disappointing year for customer service, which means it has been a disappointing year for customers. Almost every customer-facing website, whether it belongs to an energy supplier, telecoms provider, retailer or insurer, now has a homepage alert blaming the pandemic for long call-waiting times or reduced operating times.
Initially, consumers were forgiving of these slower response times; after all, the ‘we’re all in this together’ mentality applied as we collectively juggled the trials and tribulations of the shift to working from home. But many months on, as remote working has become business as usual, patience isn’t merely waning – it has run out.
The customer service paradox
At the core of most current customer service is a paradox. Teams have worked hard to embrace new digital channels – such as messaging apps, live chat and social media – to communicate with customers. However, these channels are predominantly tied to fixed contact centre locations, using on-premises software, which makes them unsuitable for remote working. Although the pandemic highlighted how unfit for purpose these legacy systems are, even without the Covid catalyst these ageing systems had long been hindering customer service teams trying to modernise.
Gain flexibility with the cloud
Forward-thinking organisations are already using cloud technology, with software accessed through a browser, to operate their contact centres remotely. The cloud revolutionises the way businesses share files, communicate within teams, manage supply chains and more. Combining it with flexible and remote working policies will make customer service departments agile enough to deal with any situation and adapt more quickly to technological changes.
While one’s own workers are important, it is essential to foster a consumer-centric model rather than an inwardly facing, business-centric approach. Although sectors have differing demands for their contact centres, for 90 per cent of customers the objective is simple: they want their issue resolved. Moreover, the modern consumer expects to communicate with a company with the same ease and convenience as in their personal interactions.
In the same way that m-commerce has become a necessity, adopted by the vast majority of e-merchants, businesses must adapt to fresh consumer expectations and set up conversational experiences on the channels their customers most frequently use. Through these channels, companies can set up a complete purchasing journey, from researching information to payment and problem resolution.
AXA: a case study for digital transformation
AXA – one of the world’s largest insurance providers – decided to increase the number of its digital touchpoints to deliver frictionless service in the Swiss market. The RingCentral Engage Digital platform gave AXA’s customers the choice of contacting them on three messaging apps (Facebook Messenger, Apple Business Chat or WhatsApp) and multiple channels (via Twitter, email, live chat or phone call).
When AXA implemented RingCentral Engage Digital, it gained a single cloud-based platform to manage all its digital interactions across every channel. At a time when speed and resources were at a premium, the AI-based smart routing system pushed incoming messages to the right agent based on urgency and skillset.
Last but by no means least, it dealt with the infuriating situation of customers having to explain their situation each time they spoke to the company but on a different channel. Engage Digital allowed AXA to merge customer identities into a single profile, so that a customer starting a conversation on one platform was immediately identified if they made contact on another channel, along with their previous interactions.
AXA improved resource handling, had no duplicate conversations and saw a 10x increase in case resolution and a 50 per cent reduction in the time taken to resolve issues.
What’s more, the new call deflection approach of AXA improves the customer journey. For specific situations such as a damages claim, customers are offered an alternative digital channel to complete their enquiry. This strategy reduces the customer's waiting time and helps AXA better manage incoming requests.
To discover how a digital engagement platform can enhance your customer experience strategy, visit RingCentral.
By Julien Rio, CCXP, Senior Director of Marketing, RingCentral
Effective cybersecurity is the cornerstone of modern hyperconnectivity
The modern digital economy is evolving quickly. With the rise of online shopping, challenger banks and fintech innovation, as well as the likes of cryptocurrency and blockchain, new transformative technologies are rapidly changing the way organisations do business and how consumers make everyday payments. Of course, some recent developments have been accelerated by the financial services industry, where these technologies present a future of a frictionless digital economy. However, such an economy relies on increased hyperconnectivity and exposes users to a whole new range of threats, from potential ransomware to phishing scams to ‘shady’ transactions. With an increasingly digital economy, cybersecurity will of course become ever more important. As revealed by the Cyber Security Breaches Survey 2020, the frequency and sophistication of cybercrime is rising steadily year on year. Almost half of businesses (46 per cent) and a quarter of charities (26 per cent) reported cybersecurity breaches or attacks in the past 12 months. Those reporting cyber incidents are also experiencing more frequent attacks this year, with some being targeted at least once a week. To deliver a safe digital economy, consumers and businesses must ensure they are fully cyber-secure, starting with the basics and everyday authentication.
Passwords remain the de facto mechanism for authentication. The problem with passwords, however, is that when they are leaked or captured by a third party, they can be used to gain unauthorised access to an account or system. This has driven many organisations to add an additional layer of security or authentication. Many sites, for example, now ask users to associate a mobile phone number with their account. The premise is that two-factor authentication prevents anyone from logging in to the associated account without access to both the password and the phone. This should in theory stop a third party from hijacking the account, as they do not have the registered phone that generates the code required to log in.
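The code such a phone generates is typically a time-based one-time password, derived from a shared secret and the current time so that it changes every few seconds. A minimal sketch of the underlying HMAC-based scheme, along the lines of RFC 6238 (the secret and parameters here are illustrative only):

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    """Generate a time-based one-time password from a shared secret.

    The counter is the number of `interval`-second steps since the Unix
    epoch, so both the phone and the server compute the same code
    independently, with no message exchanged.
    """
    counter = int(time.time()) // interval
    msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

Because the code expires with each time step, a captured password alone is not enough: an attacker would also need the current secret-derived code, which is exactly the property the paragraph above describes.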
It is possible that biometric authentication will become the standard form of providing credentials in the future, although it should be combined with multi-factor methods. Many smartphones already have biometric readers or sensors incorporated into their hardware, and the full deployment of interoperable biometric solutions should significantly reduce identity theft, benefitting the economy greatly with more reliable authentication. That being said, while there are numerous biometric solutions, none can be considered a silver bullet and one size certainly does not fit all. The accuracy of facial recognition varies greatly due to factors such as lighting, angle and camera sensitivity. Likewise, fingerprint readers are affected by temperature and other factors, and are not necessarily a ‘hackproof’ solution, as we leave fingerprints that can easily be copied on every surface. Fingerprint scans also assume the finger is presented flat; a misaligned, wet or dirty finger reads differently, leading to failed sign-ins during two-factor authentication. There are, of course, hardware security keys that are excellent for protecting accounts – however, hardware tokens involve additional costs for the device and require users to carry the token with them, which can prove cumbersome.
Voice recognition is becoming another viable biometric technique for authentication. However, it must contend with the ambient background, such as when a user is speaking in a bar, on a train, on a street or at a sports arena. There has not been much movement towards implementing voice authentication, although it does play a part in some multi-factor systems. The main barrier to widespread adoption has been the problem of aural eavesdropping, where casual or malicious bystanders may overhear private information spoken by screen readers or users.
The objective of biometric identity authentication is to establish a bond of trust between a system and the user who is requesting system access. More specifically, identity authentication ascertains a level of trust regarding who the user claims to be. It follows that the more accurate the chosen authentication method the user can present to prove their identity, the stronger this bond of trust becomes.
To conclude, while biometric solutions, authenticator apps and hardware tokens may not yet provide the complete authentication solution businesses and consumers need to fully secure their accounts and systems, they will play an increasingly important role in the future. Until some superior mechanism is created, proper multi-factor authentication via hardware security keys is the gold standard and will offer a strong line of defence.
Kevin Curran is a senior member of the Institute of Electrical and Electronics Engineers (IEEE) and Professor of Cyber Security at Ulster University. To find out more on the rise of cybercrime, visit the National Crime Agency.
Kevin Curran is a Professor of Cyber Security, Executive Co-Director of the Legal Innovation Centre and group leader for the Cyber Security and Web Technologies Research Group at Ulster University. His achievements include winning and managing UK & European Framework projects and Technology Transfer Schemes; however, he has also made significant contributions to advancing the knowledge and understanding of computer networking and systems, evidenced by more than 800 published works.
Previously the founding Editor in Chief of the International Journal of Ambient Computing and Intelligence, Kevin was the recipient of an Engineering and Technology Board Visiting Lectureship for Exceptional Engineers. He has also served as an adviser to the British Computer Society in regard to the computer industry standards and is a member of the BCS and IEEE Technology Specialist Groups and various other professional bodies.
As the world continues to navigate through the ups and downs of the Covid-19 pandemic, businesses are taking stock of its impact and deciding where to place their bets going forward. The road ahead is going to be challenging for businesses, with many obstacles still to be faced, and even more so for traditional human-to-human industries such as manufacturing and B2B sales. For several years, buyers have steadily been shifting purchasing from traditional face-to-face sales representatives to self-service through digital channels. Now that sales people are unable to meet customers in-person, digital selling technology is no longer a ‘nice to have’, but a ‘need to have’.
eCommerce has dramatically increased in importance as part of a business’ selling strategy during this crisis and will continue to be crucial long after. Digital transformation plans must focus on customer-facing initiatives underpinned by digital selling and eCommerce channels – or else buyers will simply look to competitors that can connect with them where, how and when they demand.
Trends influencing buyers
According to our recent Covid-19 B2B Buying Trends Report, 37 per cent of businesses have been primarily purchasing through digital channels since the start of the pandemic, a 29 per cent increase compared to before. That figure is expected to rise to 40 per cent post-pandemic. But not all business suppliers were prepared for this quick shift. In fact, two-thirds of buyers felt they faced challenges in working with some of their vendors through the pandemic. With uncertainty still very much a part of the current economic climate, leveraging digital commerce for forward planning will be crucial in giving businesses the best chance of survival.
There is an urgent need for businesses to ensure all avenues of selling ultimately come together in a defined, holistic omnichannel execution model that supports business recovery from Covid-19 and into the next normal. Enabling these channels and self-serve interactions requires a clear understanding of, and personalisation to, customer expectations and behaviours. Many buyers are changing their habits overnight, which demands a dynamic response from suppliers introducing new systems and tools. Vendors that reliably demonstrate they can deliver goods or services as promised are more likely to see transactions completed.
AI-powered buying insights
Delivering on this level of trust and transparency requires underlying technology well beyond just the completion of a digital transaction. Buyers are increasingly demanding responsiveness, transparency and proactivity. They want fast, personalised responses to their inquiries, immediate transparency in inventory and pricing, and vendors that proactively share opportunities for them to gain more value through the supplier.
Artificial intelligence is one such solution that is helping businesses adjust to this rapidly changing world. AI helps minimise the assumptions and ‘gut feelings’ found in traditional B2B businesses. It enables an organisation to look at a huge amount of data and extract early patterns and trends from potentially sparse information that humans may otherwise miss or misinterpret.
AI offers a means of establishing a view of the whole sales process, across every channel. It can offer a view on who to sell to, what is the right price, what volume of product should be ordered and on which channels it should be sold. Taking things one step further, suppliers can use AI to predict and anticipate customer needs so they can be met across any channel the customer chooses to engage through. They can also collect and use data on purchases over time to develop more personalised offers that truly align with those customers’ wants and needs, leading to greater customer loyalty over time.
The ability to anticipate what a customer needs – and to deliver that across their preferred channel – creates a consistent buying experience, and in turn builds better business resiliency. The volatility of the current environment has thrown traditional business methods on their heads and has forced rapid digital transformation, especially across the enablement of eCommerce. Organisations that leverage AI to develop and execute the right omnichannel selling strategy will come out on top.
By Will Lovatt, General Manager EMEA at PROS.
Companies are under increasing pressure to think and act globally. However, the immediate priority of focusing on their local marketplace and customer base means they don’t always have the breathing space to think about the bigger picture. For British companies to act local and think global, finding the right partner can make all the difference.
The coronavirus crisis has had a huge impact on our personal and professional priorities, so no wonder it has caused businesses of all sectors to rethink their strategies and concerns too. TCS’ recent Covid-19 Business Impact Survey of 300 global business executives revealed that more than 90 per cent of businesses have maintained or increased their digital transformation budget. Throughout history, business disruption has been followed by opportunity, and now is no different.
Business opportunity and innovation are available to all organisations if they are willing to embrace four key behaviours:
This last behaviour is the most important in this context. As companies adapt and recognise the importance of investing in digital transformation, there will be much that the internal corporate structure cannot solve, which is when many will turn to trusted partners. In doing so, UK organisations will become more tech-forward, driving innovation and strengthening both their business and the wider economy.
For example, TCS developed Secure Borderless Workspaces™ (SBWS), a sophisticated and fully integrated model that enables staff to work remotely. Through SBWS, we were able to deploy TCS DynaPORT, a state-of-the-art operating system, at Forth Ports’ new ferry terminal in London, while staff worked from home. This opened up new shipping opportunities for Britain and eased supply chain pressures.
With the impact of Covid-19 still leaving many industries reeling, TCS is using its contextual and global knowledge to help businesses adapt and thrive – from establishing dynamic supply chain systems for retailers, to developing solutions for remote talent management, to supporting financial organisations and transactions with cloud technology. The opportunities offered by technology are enormous, but they are often missed by enterprises that lack the broad industry expertise or extensive experience of digital transformation and technology solutions.
As we operate in the UK, we make sure that we invest in UK plc too, whether through the number of people we employ or the brands we work for. TCS has vigorously supported its British customer base and 19,000 UK employees from the very start of the crisis. We enabled 99 per cent of our UK employees to work from home using our SBWS operating framework in just a matter of days.
We were also able to support our employees when they needed it most, offering benefits such as an employee assistance programme (EAP), health information and advice, and a virtual engagement team set up by the HR department to promote wellbeing and keep employees connected. Keeping employees happy is vital to the survival of business in the UK, since doing so keeps motivation and productivity high.
A number of our major UK customers are also benefitting from our innovative operational framework. With our help, Halfords was able to adapt its systems to meet increasing consumer demand for bikes and outdoor equipment. This crisis has shown that by investing in the right technology, companies can boost their sales and increase their market cap as more commercial opportunities become available to them.
It’s also vital for the UK economy that companies invest in the future skills this country needs to thrive in the global digital economy. The nation’s young people have a real love of technology, but there is often a disconnect with how this technology is made. Through extensive STEM outreach, TCS is helping to bridge the knowledge gap. Working closely with charities, social enterprises and TCS employees, TCS’ IT Futures programme has already reached 300,000 students since its launch in 2013. Adapting to the digital economy is an ongoing challenge and we continue to evolve and invent. But the UK needs to embed this in the education system at every level if it is to produce a workforce with the digital skills it needs to succeed, here and internationally. Technology companies can go some way in making this happen.
I’m confident UK plc is going to bounce back even stronger from what has been one of the most challenging years for businesses. It is imperative that global companies that operate here contribute and local companies embrace that contribution. As the global economy starts to recover, it will become increasingly focused on a digital tomorrow. To avoid falling behind their international competitors, British businesses will need to become even more tech focused. This includes finding the right partners to invest in them and their communities. With its worldwide reach, TCS has the expertise and resources to help drive British business success, as well as invest back into the country by promoting digital skills and supporting our local workforce.
By Jim Bligh, Director of Corporate Affairs for UK & Ireland at Tata Consultancy Services.
Michel Serafinelli, University of Essex
For years, we have been promised a work-from-home revolution, and it seems that the pandemic has finally brought it to pass. In April this year, at the height of the first wave of coronavirus, 47% of people in the UK were working from home, the vast majority of them doing so because of the pandemic. In a sense this is overdue: the work-from-home potential for UK employees is 32%; in France, Germany and Italy between 24% and 28%.
This structural transformation has the potential to at least partially undo another transformation from the previous century. With the decline of manufacturing in the United Kingdom after the 1970s, some cities – including Hull, Sheffield, Bradford and Stoke-on-Trent – entered a spiral of high unemployment and out-migration that has lasted to this day. This trend is echoed in other “rust belt” cities such as Saint-Etienne in France, Wuppertal in Germany and the American city of Detroit.
The rise of teleworking could end that spiral – if the right conditions are met.
The changing workplace
It’s unlikely that telework will end when the pandemic does – we will instead probably see workplaces encouraging a mix of in-office and home working. Some organisations may start asking workers to be in the office for only two to three days per week, while others may opt for a “conference model” (that is, a few consecutive days or a week per month for all employees).
This does not mean the death of big cities. London will probably stay attractive and innovative thanks to its very strong initial advantage. San Francisco and Seattle in the US, Munich in Germany and Amsterdam in the Netherlands will all remain hubs for knowledge workers. Scholars believe face-to-face contact still rules when it comes to creativity, and such cities provide an environment that is conducive to innovation.
But rust belt areas are cheaper, and could attract skilled workers to regularly spend more time there once the pandemic is over.
The job multiplier effect
How can formerly deprived cities thrive after the pandemic? To understand the potential for revitalisation of rust belt cities, we can invoke the job multiplier effect. This is where the presence of skilled workers helps create other jobs through increased demand for local goods and services. For example, after their day on Zoom (at home or in a local co-working space), skilled workers will want to go out. In this way they support a barista, a waiter, a chef and perhaps a taxi driver. Some will decide to renovate the house they live in, and ask a local architect. Once or twice a week they go for yoga. They may need a dogsitter when they travel.
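The arithmetic behind the multiplier idea is simple: each incoming skilled worker's local spending supports some fraction of a service job. As a purely illustrative sketch (the multiplier value here is an assumption for the example, not an estimate from the research):

```python
def local_jobs_created(new_skilled_jobs: int, multiplier: float = 1.5) -> int:
    """Estimate additional local service jobs supported by incoming skilled workers.

    The multiplier is hypothetical: it says each new skilled resident's
    spending on cafes, trades, childcare, yoga classes and so on supports
    1.5 local service jobs on average.
    """
    return round(new_skilled_jobs * multiplier)

# Under this assumed multiplier, 2,000 remote tech workers relocating to a
# rust belt city would support roughly 3,000 local service jobs.
```

Empirical estimates of the multiplier vary widely by country, sector and skill level, which is part of why, as the following sections argue, not every city will benefit equally.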
This is not the only mechanism that could help with local revitalisation. Some of the people regularly spending more time in rust belt areas would be entrepreneurs, and we may see new business creation, as they seize new opportunities in industries such as culture, renewable energies, tourism, quality agro-food or handicraft.
In principle, therefore, our increased ability to work from home could lead to new growth opportunities.
Will it work?
But there are important caveats. Not all rust belt cities will be able to take advantage of the post-pandemic world. After all, there were large differences in labour market performance after the 1970s, when the aggregate number of manufacturing jobs started to decline.
In the UK, both Middlesbrough and Slough had 44% manufacturing employment in 1970. But their experiences in the three following decades were vastly different, with Middlesbrough employment declining by 13% per decade and Slough employment growing by 12% per decade. Places such as Norwich and Preston in the UK, Bergamo in Italy, and San Jose in the US were traditional manufacturing hubs that nonetheless performed well in the decades that followed the start of manufacturing decline in their countries.
To understand why we may see large differences across different cities again with the rise of working from home, we first have to think about differences in what economists call human capital endowments – this relates to the skills of the workforce in a particular place. For example, if locality A has a greater share of the workforce with a university degree than locality B, it has a higher human capital endowment and is more likely to recover from industrial decline.
The skill level of the workforce is important for the task of local reinvention – in our research team’s analysis of the reinvention potential for cities, we used the share of the workforce with a university degree as a proxy for this. To distribute these advantages across the board, scholars studying declining areas have called for measures aimed at boosting training and facilitating the assimilation of knowledge and innovation.
Another important challenge is the digital divide – the gap in speeds between areas with privileged access to the internet and the rest of the country. In the UK this is more than just a gap between urban and rural parts of the country – inner-city areas in London, Manchester, Liverpool and Birmingham are also left behind. A large reduction of this gap was important for job creation before COVID-19 – it should be a top priority now.
Local amenities also play a role. For skilled workers with family ties in a specific area, once they decide to regularly spend more time outside London, the choice of location is often pretty clear. For skilled workers without such ties, factors such as the cultural and recreational activities on offer in a new city become important, especially since they are used to a vibrant selection in London.
Overall, rust belt areas in Western economies face some opportunities for regeneration with teleworking, but there are also several important challenges. To maximise the potential for success, governments should consider measures that boost training, investment in high-speed broadband and improve transportation links between these cities and London.
These kinds of investments would help smaller cities such as Middlesbrough, Hull and Stoke-on-Trent take advantage of the new opportunities presented by telework. Otherwise Manchester and, to some extent, other larger cities such as Birmingham and Liverpool could be the winners among the rust belt in the post-coronavirus work-from-home economy.
Michel Serafinelli, Lecturer in Economics, University of Essex
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Andre Spicer, City, University of London
Anyone who tried to get their head around the financial crisis of 2008 soon found themselves drowning in an alphabet soup of BEITs, CDOs, CDCs, ETFs and MBS. When British novelist John Lanchester wrote about this world he commented that “you are left wondering whether somebody is trying to con you, or to obfuscate and blather so that you can’t tell what’s being talked about”. He wasn’t wrong.
One recent study shows how people are more likely to use jargon when they feel insecure. Led by psychologist Zachariah Brown, it shows how some groups use jargon specifically to make up for having a low social status.
In one experiment, they looked at 64,000 dissertations from hundreds of universities in the US and found that those written by students from lower-status institutions used more jargon. In another part of the study, they asked participants to pick a pitch for a start-up. When people were put into a lower-status position, they found they were more likely to pick jargon-laden pitches. In a range of other settings they noticed that when people found themselves in a low-status position, they were significantly more likely to reach for jargon.
Clearly, there are pitfalls to jargon. Research shows how it can be a major turnoff in the business world. One study found that knowledgeable investors were unimpressed by investment propositions that were filled with unnecessary jargon. Similarly, jargon can make non-experts see new technologies in a more negative light. Another study found that when new technologies are presented to people using jargon, they tend to see them as much riskier.
Jargon is, by definition, exclusionary. This means it can get in the way of understanding crucial information. One study found that the frequent use of medical jargon by doctors meant their patients didn’t understand about half of what their doctors said to them.
Even between experts, it can be counterproductive. A study of different subfields in ecology, for example, found that key terms would often mean very different things to different experts. This would then trigger heated but ultimately fruitless disagreements.
The upside of jargon
Jargon might be infuriating, but it’s also useful. Jargon sums up complex issues in fewer words. This enables experts to talk precisely to each other about concepts they are familiar with.
Jargon can help remove emotion when tackling difficult topics. Doctors, for example, often dehumanise patients by talking about a person in pain as an interesting case of some specific disease. Research shows that this helps create emotional distance, which allows them to make more reasonable decisions.
But this can also be problematic. In 1984 the US State Department replaced the word “killing” with “unlawful deprivation of life” in its human rights reports to help cover up the unpleasant reality of government-sanctioned killings in countries the US supports.
Jargon is also used to solidify a sense of belonging within groups. Professional wrestlers, for instance, talk about their sport as “business”, getting into the ring as “going to work”, and putting on a convincing performance as “selling”. Similarly, North American truck drivers use expressions like “bobtailing a twin screw jimmy” to purposefully exclude non-truck drivers from their conversations.
Resisting a full ban
The dangers of jargon have spurred frequent calls to ban it altogether. In 2015, the then British prime minister, David Cameron, asked civil servants to ensure their communications were jargon free. In 2010, the then US president, Barack Obama, signed the Plain Writing Act, which required federal government documents to be written in a “clear, concise manner”. Presidents Nixon, Carter and Clinton all signed official orders requiring simple and plain language be used in government.
These world leaders were all following in the footsteps of George Orwell who in 1946 recommended that you “never use a long word where a short one will do”. But Orwell’s advice was preceded by Thomas Sprat, who in 1667 wrote how members of the newly founded Royal Society resolved “to reject all the amplifications, digressions, and swelling of style: to return back to the primitive purity, and shortness, when men deliver’d so many things, almost in an equal number of words”.
Despite these constant calls for plain language, jargon seems to have a habit of returning. Instead of trying to take on the impossible task of creating a jargon-free world, we might narrow our ambitions and just try to cut out what the scholar Russel Hirst calls “bad jargon”.
Some potential indicators of bad jargon are words that look or sound strange, hybrid coinages, or terms that are difficult to pronounce. After chasing out the bad jargon, we need to ensure that any specialist terms which are left are “good jargon”. That means they should be economical, precise and as universal as possible. Instead of fighting against all jargon, we should follow Russel Hirst’s advice and become champions of good jargon and its staunchest defenders.
Andre Spicer, Professor of Organisational Behaviour, Cass Business School, City, University of London
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Stephen Reidy, CIO, Three Ireland, and Karthik TS, Head of CoE, Torry Harris Integration Solutions (THIS)
Three Ireland’s digital transformation initiative, “3Vision”, was about creating brand loyalty and consistency through digitalization after the merger with Telefonica O2 Ireland. As with any merger, the new entity faced a major consolidation task.
Three Ireland decided to use the merger as an opportunity to run a modernization initiative alongside the consolidation and rationalization of the two brands, over a three-year period.
The transformation was planned in three phases.
Three Ireland partnered with Torry Harris Integration Solutions (THIS), who specialize in enabling meaningful digital ecosystems through API-driven integration, to build a strong integration backbone. This enabled Three Ireland to provide a consistent, streamlined omni-channel customer experience despite complex changes happening at the back-end. Because the front-end and back-end systems were changing at different paces, the integration layer had to do a great deal of heavy lifting.
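The kind of heavy lifting described here can be pictured as a facade in the integration layer: channel apps call one stable interface while the two legacy back-ends are consolidated behind it at their own pace. The sketch below is purely illustrative — the class names, payload fields and routing rule are hypothetical assumptions, not details of the actual 3Vision architecture.

```python
from dataclasses import dataclass

@dataclass
class Customer:
    """Canonical model exposed to every channel (web, app, retail)."""
    id: str
    name: str
    balance_cents: int

class LegacyThreeBackend:
    """Stand-in for the modernized system of record (hypothetical payload shape)."""
    def fetch(self, cid: str) -> dict:
        return {"customerId": cid, "fullName": "A. Example", "balance": 1250}

class LegacyO2Backend:
    """Stand-in for the pre-merger system awaiting migration (different shape)."""
    def lookup(self, cid: str) -> dict:
        return {"id": cid, "name": "A. Example", "credit_eur": 12.50}

class IntegrationFacade:
    """Routes each request to whichever system still owns the record and
    maps both payload shapes onto the one canonical model, so front-end
    channels never see which back-end answered."""
    def __init__(self, three: LegacyThreeBackend, o2: LegacyO2Backend,
                 migrated_ids: set[str]):
        self.three = three
        self.o2 = o2
        self.migrated = migrated_ids  # grows as records move across

    def get_customer(self, cid: str) -> Customer:
        if cid in self.migrated:
            raw = self.three.fetch(cid)
            return Customer(raw["customerId"], raw["fullName"], raw["balance"])
        raw = self.o2.lookup(cid)
        # Normalize euros to cents so both sources look identical to callers.
        return Customer(raw["id"], raw["name"], int(round(raw["credit_eur"] * 100)))
```

As records migrate, only the facade's routing set changes; the contract the channels depend on stays fixed, which is what lets front-end and back-end evolve at different speeds.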
Torry Harris supported Three Ireland throughout this transformation.