How to build a culture of resilience

The technological side of remote work is the least of the problems of today’s business technology leaders, say Gartner analysts Daniel Sanchez Reina and Fabrizio Magnani: 

“People’s resilience is at stake, not only because of remote work, but also due to the shock and uncertainty of COVID-19 and economic stress that is shaking the foundations of company cultures.”

In a recent report, they cite the emergence of a group of office workers who feel disconnected from their remote colleagues, and a cohort of remote workers who feel equally disconnected from the rest of their team. It’s an issue some New Zealand tech leaders have tackled.

Gartner’s solution: “CIOs must build a culture of resilience.” 

To do so, Sanchez Reina and Magnani list four behaviours CIOs can incorporate into their IT organisation’s culture.

  1. Create an environment of ‘open vulnerability’. This means staff bring their problems and fears into the open, giving leaders the chance to address them.
  2. Identify ‘impact orientation’. The key is to identify what meaningful work is for each team member.
  3. Provide ‘intrinsic rewarding’. “Feed each employee’s main basic recognition need by identifying which of the three basic motivational needs of human beings prevail in them,” advise Sanchez Reina and Magnani. They note that people are governed by three major types of needs: achievement (sense of accomplishment), status (prestige and influence) and affiliation (friendly relationships).
  4. Foster an environment where a ‘sense of tribe’ goes beyond the IT organisation, by establishing permanent links with other parts of the company.

NZ CIO turned CEO helped teams manage anxiety at work and home

These insights resonate with Alin Ungureanu, a former healthcare CIO and now CEO of financial software provider Chelmer.

Over the past few months, as the pandemic crisis unfolded, Ungureanu says they held regular meetings: one on one, with teams, and with the whole company. “The company meetings were focussed on updating everyone on our stability to give everyone certainty and remove personal and family anxiety,” he says. In these meetings, 15 to 20 per cent of the time was spent on business; the rest was given over to staff concerns and their personal accomplishments during the week.

“We involved not just our employees in our sessions but our strategic partners as well,” he says. “We recognise that we are in this together and we play our part by maintaining the high performance to support our clients’ businesses and employees.” He adds, “We had a common purpose, to continue to do the best for our clients and this provided stability and certainty for our company and our staff and family.”

During these times too, they learned to “accept less than best and less than perfect but not compromise company values, company culture and personal values,” Ungureanu says. “Emergency situations should be used as a motivator not as an excuse. The events brought us more together.”

NZ business coach explains how to achieve ‘open vulnerability’

Gartner analysts Sanchez Reina and Magnani advise leaders to “keep [their] senses wide open to identify concerns—personal or professional—that the team may not be sharing” with them.

But how to do that? Steve Griffin, a business coach and trainer, says the simple question ‘are you okay?’ can unearth these concerns. “Stoicism and the culture of ‘I’m okay’ can be a killer,” says Griffin, who became a business coach after executive roles in technology firms and as a military commander.

He relates that during Operation Southern Watch in the Gulf, he worked with a young corporal who was normally very vocal.

“As the operation progressed, I noticed that he was becoming increasingly withdrawn and sullen,” says Griffin. “I took him to one side on more than one occasion and asked if he was okay. He always replied that he was ‘fine’ and nothing was wrong. I had more than 200 people under my command and a very stressful job, so I must confess that I accepted the obviously cursory answer.

“However, there are times when intuition or a ‘sixth sense’ kicks in. I just felt that there was something deeply troubling this young man. The next time we spoke, I didn’t accept the ‘I’m fine’ response. I explained to him that nobody was fine, we had put ourselves in danger, many people were frightened. We had been separated from loved ones for over four months now and it was okay to be not okay.”

The young corporal broke down and cried, says Griffin, and told him of his worries about his family. His brother had committed suicide only weeks before, and his mother had been diagnosed with breast cancer. “He was beside himself with worry, unexpressed grief and the ‘shame’ of not being able to cope,” says Griffin. “I arranged for him to fly home on compassionate grounds and made sure that he had support at the other end.”

He believes this experience provides lessons for organisations working through the current crisis. “We are currently living through a pandemic, the future is uncertain both financially and personally,” says Griffin. “We may have loved ones overseas, elderly relatives or other at-risk family members. We are being asked to change the way we live and the way we work. Many of us are not ‘okay’. As leaders, we need to recognise this and be willing to act.”

This article was originally published in CIO New Zealand.

The author, Divina Paredes, is a New Zealand-based writer interested in #CivilSociety #SpecialNeedsCommunity #SocialEnterprise #Data4Good #ICTTrends #Tech4Good #DigitalWorkplace & #Sustainability. Reach her via @divinap

Falling for the inevitable but not infallible AI

Why Dr Ayesha Khanna advocates for artificial intelligence and other disruptive technologies, while keeping a watchful eye on them

When Dr Ayesha Khanna talks about artificial intelligence, she does not start with technology.

“That’s the wrong way to start,” says Khanna, co-founder and CEO of ADDO AI, an artificial intelligence (AI) solutions firm and incubator.

“I start with impact.”

For her, one way to do this is to imagine Barack Obama as a law student.

According to her, if Obama were to enter law school today, he would most likely be studying AI.

In fact, she says a professor at Yale Law School told her one of the most popular courses for their students is around robots and AI. 

She cites a possible scenario wherein a new lawyer is asked to research a case, to find similar cases and any precedents. This “grunt work” is not exciting for this lawyer.

Meanwhile, software from Ravel Law can go through all the related cases in the United States and find interesting correlations, such as who constituted the jury, which arguments were made and how the cases were decided.

“This is now the kind of automation that we are seeing across all fields,” states Khanna.

She cites another case, the creation of ‘Ross’, which was touted as the first AI lawyer.

“Everybody was worried Ross would replace lawyers,” she says.

Ross can mine huge amounts of data and give insights to support humans, she explains. “It is important to note that human input is still part of the equation.”

“Yes, we need lawyers, but we don’t need them trained the way they were, or working the way they did. We want them to focus on what they went to law school for: to think,” adds Khanna. “And we need them to understand how to work with machines.”

She stresses that law schools are not lagging behind the AI trend either.

Harvard Law School, for instance, has started the CaseLaw Access Project (CAP).

This is a compilation of all precedential cases – covering 40,000 volumes of case law comprising some 40 million pages of text, from 1658 to the present.

In partnership with Ravel, the law school has posted 360 years of US case law on the internet, free and available to students.

“This is an essential tool for lawyers in the future,” she declares.

For Khanna, these examples of how AI has changed a traditional industry like law present critical lessons for today’s business leaders on how they should view this set of technologies.

“If you don’t know about new technologies like AI, virtual reality, and Internet of Things (IoT), how can you innovate?”

She adds that, “Innovation today is driven and made possible by many of these technologies.”

“If you don’t move fast, someone unexpected will come after your business with the power of data and AI,” warns Khanna.

“I spend a lot of time educating senior executives, and encouraging them and their middle management to train business users on the basics of AI.”

She tells them: “If you don’t have an AI-first approach, you are going to be disrupted by a competitor.”

“First of all, it is important to have someone technically savvy on your Board,” she advises.

“You need to have diversity on the Board. As more and more products are powered by AI, make sure they are not biased in any way.”

Diversity in the team is also important. “If you have women, minorities of all ages [in your team], they will notice this [bias] before the product hits the market.”

“You have to train your people around governance,” she states. “Looking out for bias is an ongoing, constant process.”

Khanna points out, too, that “Data you put in is important, oversight is important. Inspect your algorithm to make sure it is not biased.”

She has a similar message for those in charge of regulatory systems.

Regulatory panels should not just be composed of lawyers and politicians, but also technology experts and philosophers, she says.

“Get the experts at the table so they can truly inform what is happening right now because change is happening so fast.”

It takes a community

Khanna is emphatic about the need to upskill and empower everyone to use AI.

“There is no job now that does not have, or will not have, someone working with AI and data as part of the team.”

She says it is likewise important to ensure programmes will nurture the talent to succeed in the AI-driven world.

“Every single thing my children do will be impacted by AI,” she states and notes that, “To have sustainable and equitable growth in an AI-powered economy, we need to guide and empower our nation’s talent.”

“You need to upskill and empower everyone. Your domain expertise and knowledge will be your advantage.”

At the same time, an AI engineer cannot go to any business or company and try to change it with the use of AI without understanding the domain.

“AI is your responsibility, your opportunity and your problem,” she points out, in a message addressed to leaders in government and the private sector.

“You need an interdisciplinary team,” she says.

She adds that the ability to collaborate is an important skill. “A lot of innovation will be led by people in partnership with the business.”

Khanna further says, “You will need to work with deep technology experts on AI, robotics and virtual reality. You need to think for your company, your customers and what matters to them. You need the basic skills to connect the dots, look at a problem and the solution. The glue between them is technology.”

“AI is inevitable,” she says. “You will run into the need for AI and data.

“You must be open to ideas, be able to work with people like AI engineers, and be empowered and confident enough to question their biases.”

She also says, “You should be able to probe and make sure they reflect the values of your company.”

Khanna thus encourages everyone – no matter their age, gender or background – to learn a bit about AI.

“At the very least,” she says, “you can sign up for newsletters that will keep you in the loop of interesting things that are happening in emerging technologies.”

This way, “you can recognise it and feel empowered to be part of the AI-powered economy.”

Khanna, who has an undergraduate degree in economics from Harvard University and a doctorate in information systems and innovation from the London School of Economics, is personally involved in this advocacy.

She is the founder and chairperson of 21C GIRLS, a not-for-profit that provides free coding, artificial intelligence, and robotics classes to girls in Singapore.

She says the not-for-profit has taught over 5000 students in schools and community centres and is supported by a range of organisations like Google, VISA, Goldman Sachs, and PayPal.

She is also founder of Empower: AI for Singapore, a national movement to teach all youth in the country the basics of artificial intelligence.

Khanna, however, lists a range of issues that need to be considered as organisations move into the AI world.

She points out AI is not infallible.

“It is important for us to look at the benefits, but also think about the risks,” she says, citing “the importance of regulation, safety nets, and the human input on making final decisions.”

She notes that 5G is coming, connecting over a billion sensors around the world with IoT.

“But if everything is connected, you think of cybersecurity, and you think of data protection, as well. You should have the right to your information, and it should be easy for you to understand how the company is using your information.”
