Data access and privacy as human rights
By Joe Green | 28 October, 2020
Show Notes for Series 02 Episode 09
Just a few short years ago, technologists were predicting that smart algorithms could determine our tastes and perhaps suggest other goods or services we might buy! Now, with that type of technology a reality, what’s the next logical step, and where does that leave our rights to privacy?
Who owns our data? Is privacy a human right? How do companies work with each other, with governments and with non-profits to develop modern data frameworks that allow tech’s evolution yet stop us being changed by the few holders of big data?
The guest in this Tech Means Business podcast is Cisco’s Naveen Menon, the president of the company’s ASEAN operations. Naveen is also part of the World Economic Forum, where he advises, consults and guides companies and governments at the highest level.
Over the course of this podcast, we move from data privacy and anonymity issues to the ways that companies are now becoming both more aware of, and proactive in the fields of data management, protection of data rights and how data is shaping the near and far futures.
Naveen began his career as a management consultant, has worked all over the world, and now spearheads Cisco Systems' operations in ASEAN, pushing positive and constructive models of digitization for businesses across the region.
In today’s socially-democratic arenas, where anyone with a LinkedIn account can term themselves a “thought leader”, it’s refreshing to hear from a technologist who has something vitally important to say and a positive message to promote.
Definitely worth watching a YouTube segment or two, as recommended by Naveen. Start here, then follow those recommendations as made by the creepy AI algorithm:
Naveen Menon on LinkedIn:
https://www.linkedin.com/in/naveenmenon/
And Joe’s LinkedIn is here:
https://www.linkedin.com/in/josephedwardgreen/
Full transcript available.
Joe Green (host): Hello there, welcome to the Tech Means Business podcast. In this series of podcasts we discuss all things technology with leading members of industry and business. It's the place, I guess, really where technology and business come together. And there aren't really many areas of any business organisation that aren't run to a greater or lesser extent by technology!
Today, I'm delighted to announce that I'm in very distinguished company. I'm joined by Naveen Menon of Cisco Systems. Naveen is president of the Asia Pacific region for Cisco and also a newly-appointed member of the World Economic Forum. So absolutely marvellous to have such a well-rounded and knowledgeable guest. Welcome Naveen.
And for those of you who don't know, who've been living perhaps in a cave for the last 30 or 40 years, Cisco is basically the company whose infrastructure has built the internet, largely, amongst others. It has been around for a long time, and is essentially the gold standard in all things networking, and these days, quite a lot more. So it only remains for me to say welcome to Naveen and, as is traditional, or has become traditional on this podcast, could we start please by asking you to give us a little potted autobiography, a potted history of who you are, what you do and how you ended up at Cisco?
Naveen Menon (guest): Okay, so I'm an Indian, I was born in India, but I moved to the Netherlands when I was five. So I'm actually Dutch. I spent most of my life in Holland: I grew up in the Netherlands, went through a British education and then a Dutch university education, and started my career at General Electric, but then moved quickly to management consulting. I built most of my career at A.T. Kearney, which is a strategy consulting firm. I worked all across Europe and North America, and then moved to Asia in 2006, became a partner, and led the firm's telecoms, media and technology practice in Asia for several years. I joined Cisco only about three years ago, as the head of Southeast Asia. So I oversee and run the business in Southeast Asia, based out of Singapore.
Joe Green (host): Also Naveen, you’ve been very modest here. You’ve joined the World Economic Forum recently. Can you give us a couple of words about that?
Naveen Menon (guest): Yes, I've actually been working with the World Economic Forum since 2013, when I was involved in two years' worth of work with the Forum on the topic of personal data, privacy, data policy, data for good, and topics like that. I was a knowledge advisor to the Forum, and fed thought leadership into various forums and discussion meetings, and into Davos. And through that process, over the last six, seven, eight years, I've been very involved in work around data policy.
So back then, in 2013, when everyone was starting to go social, starting to build up their online profile and figuring out what’s going on in terms of, you know, new social media networks and the new data models, we were looking at how can we ensure data is, you know, adequately distributed worldwide, and there is access to data equitably, and we’re trying to figure out how to make sure that the people that need it the most could get it and that also, the rights of the individual were maintained.
At that time, there wasn't a lot of thought put behind those kinds of notice-and-consent rules. So for example, when you sign up to a new service now, you have to read 72 pages of legal docs, of legal terminology; you have to accept, and only if you accept do you get access to the service.
And back in those days, we were giving up a lot of information, and we didn't even realise what we were giving up! So I actually put a lot of work in back then around individual rights and personal data rights. And that's translated now into my current role, where I serve on the Global Future Council on data policy, looking at the same topic: how can we ensure that appropriate policy frameworks are there for data to be distributed and safeguarded around the world?
Joe Green (host): Well, there’s a lot to unpack there. And I think to begin with, I’d like to talk about data policy, and also on an individual basis, and perhaps we could start with the difference or the distinction between privacy and anonymity? To begin with, do you think those two things are often mushed together, as it were, or confused? And do you think they’re separate issues that need considering?
Naveen Menon (guest): Yes, I think they are different issues. The topic of privacy is about what I believe, and what Cisco believes, is essentially a fundamental human right: the right to have a private life or a private identity. It is a rights issue.
The issue of anonymity is a choice that you make. If you want to be anonymous, it's important that the privacy framework allows you to be anonymous. So that's an individual choice. They are two very different issues: one is about a fundamental human right, the other is about a choice that individuals need to make. But sometimes they get confused with each other, which I can understand, because they're quite complex topics.
Joe Green (host): It's an interesting topic, isn't it? This idea of a right to privacy, both on the internet, digital privacy, and in real life as well, in physical life. Are you one of those people who thinks the internet needs more regulation in order to promote the idea of privacy? Or are you okay with the internet being one of those self-regulating bodies in that sense?
Naveen Menon (guest): Yeah, I'm not someone that believes that it needs more regulation. But I just think that we need to constantly be looking at regulatory frameworks and seeing if they're fit for purpose at this point in time. I certainly don't want more regulation, and I don't think anyone wants more regulation. What we want is, every year, in an ideal world almost every instant, the right to ask: do these regulatory frameworks make sense? And can we change them?
You know, these regs, a lot of regulatory frameworks were built back in the industrial age, and they haven’t necessarily changed. So I think we need to look at relevancy, fit for purpose, and we need to basically evolve regulatory frameworks.
Joe Green (host): Isn't that part of the problem, though, that regulatory frameworks tend to be imposed nationally, or even regionally? Sometimes transnationally, or internationally, for instance in the American alliances, or in the EU? And is that something that the World Economic Forum is pushing towards? You can regulate all you like in Singapore as a standalone, or in the UK, but because the internet's an international network, applying regulatory frameworks is actually very difficult.
Naveen Menon (guest): Yeah, it is. And look, it's not just about the Forum there. I mean, there are many institutions that are doing great work in progressing and improving regulatory frameworks. Where I think the World Economic Forum does a particularly good job is that it engages at the CEO and heads-of-state level, to make these very influential people aware of the issues.
And what I found is that once these CEOs and heads of state, and heads of civil society are engaged in a dialogue, then you can actually get better understanding. And once there’s better understanding, then that leads to more informed decision making. And once there’s more informed decision making, you need regulators and public policy professionals to understand the law and to then put in place policies and legal frameworks underneath it.
So it's a whole ecosystem of change, you know. Where the World Economic Forum's role is important is that it engages at a certain level of the organisation, whether it's a public sector organisation, a private sector organisation, a not-for-profit, or a university, to progress the debate and then have a more informed dialogue. And that's the purpose of that level. But we do need policymakers and legislators, in government and in the private sector, to be involved in drafting and writing regulation that is more fit for purpose and more practical.
Joe Green (host): Yeah, I agree. I think, to a certain extent (and this is clearly a personal opinion, not necessarily one endorsed by my employer), that the key is the relationship between governments and private organisations. And of course, those private organisations don't have to be businesses. They can be nonprofits. They can be open-source foundations, if you like. As part of your work with governmental initiatives in the Asia Pacific region in which Cisco and you are involved, can you tell us a little bit about that synergy between governments and independent bodies around digital strategies? What's actually happening on the ground to further those partnerships?
Naveen Menon (guest): Yeah, so some of those public sector institutions are our very close customers and partners (we are very close to governments around the region and around the world). That gives us a very privileged position to influence, and also to support, decision making on matters of policy. In some cases we do that directly, and in others we do it through independent parties as well.
So for example, I also chair the US-ASEAN Business Council, which is an extension of the Foreign Office of the United States. We represent American interests here in Southeast Asia with all the regional governments. So I chair the US-ASEAN Business Council, where I represent Southeast Asia and the ICT community: all the major companies in the technology sector, some 400 of them, on matters of policy with the ASEAN governments here in Southeast Asia.
There are many ways in which we can engage, to drive matters or at least influence matters of policy, and you’re finding a lot of corporations are getting involved in trying to make a wider societal impact. And this can only be done through partnership with the public sector.
Joe Green (host): Is the problem, or is a problem, one of speed, do you think? Again, this is a personal opinion, but it always seems to me that legislation takes ages to pass through national legislatures, through the executive and through all the due processes. So it can take, let's say, two years for a piece of legislation to be proposed, written out, presented, go through the processes and eventually become law. And, of course, two years in technology is a hell of a long time. So is that part of the issue, that sort of glacial speed of governmental process?
Naveen Menon (guest): I think it's not. I mean, I don't think there's an expectation from the corporate sector for decisions to be made quickly, or for things to move quickly. These are typically very complex problems that require multi-stakeholder kinds of conversations.
So for example, take Indonesia: 13 million people there, out of a total population of about 250 million, don't have access to the internet, spread across twelve and a half thousand remote villages.
So basically, when COVID hit Indonesia, the government locked down, which was the right thing to do at the time, and [said] that everyone needs to work from home or study at home. The problem is that more than a third of Indonesian students don't have internet access or access to an affordable device in order to study. So there are a lot of children that basically didn't go to school and therefore are being left out of the education system.
So how do you fix that problem? I mean, I think everyone wants to fix that problem! I can't imagine anyone, public sector, private sector, nonprofit or individual, who thinks that's fair. No one thinks that's fair; everyone wants to fix it! But this is a very complex problem that requires lots of conversations, lots of discussions, lots of trials and lots of attempts before we can get it right. We need to keep at it and be persistent, we need to keep raising the issue, and we need to find alternative ways to solve the problem. That's where I think the biggest challenge is currently: […] we need to build a bigger generation of problem solvers.
You know, so we don't need more regulation; what we need is more effective regulation. We need more effective policies, and more effective policies actually come from problem solving: what is the issue? Then identifying hypotheses that can address the issue, and then testing them, testing alternative approaches. And that's where I think we have the biggest problem right now: we're failing in the ability to experiment and try out new things.
Joe Green (host): Now, just to circle back to the issue that you mentioned there, about 13 million Indonesians with no internet access. It strikes me that what we probably need to promote as a human right is access to information and access to the internet, in the same way that we stress provision of and access to education, clean water, freedom from disease and basic shelter. Do you feel that we should be giving, as a human right, internet access? And also, as an adjunct to that, protection from harm on the internet?
Naveen Menon (guest): Absolutely, I think so. There's absolute merit in what you're saying. The challenge is in implementation. The challenges are in the definition of who plays what role, in the definition of where the original content exists, of who is holding that content and who's distributing that content.
It's become a very distributed kind of world, and a delegated world. And so the ability to hold institutions accountable is getting less and less implementable. That's why I think the right of the individual to privacy, and the right, or the choice, of the individual to go anonymous, is fair, and a very important step. A vital first step.
There are many cases of internet access being given, but in exchange for other benefits, if you like, that are provided to the provider of the internet, right, which the individual doesn’t even necessarily know that they’re giving up. I don’t know if that makes any sense or not. I can give an example. If you like?
Joe Green (host): No, it makes perfect sense. I think essentially what you’re saying here is that there’s a quid pro quo. And it always happens to a certain extent. I mean, I’m sat here in my home office, and I’m paying an internet service provider for me to access the internet. And unless I do something about it, my ISP can also sell information about where I am and what I do and where I go on the internet. And that’s available for them to sell on to third parties. And it’s kind of an unspoken agreement, really!
And that reciprocation, I think happens at all levels, doesn’t it? Is that what you’re saying? You can give everyone in a school a $100 laptop, and yet you enter into an arrangement that school then has to use Office 365! Is that kind of quid pro quo arrangement the thing that you’re talking about here?
Naveen Menon (guest): That's right. And I don't want to name any names, but let's just give an example, and this is the one that was the most compelling to me. It's on YouTube, so you can look at it yourself, and everyone listening can too: Yuval Harari's speech at the World Economic Forum in Davos in 2018. It was the opening address, a half-hour speech. Very, very compelling. What he does is talk about his own personal story, and give a hypothesis at the end of it. He starts by saying that when he was 17 years old, he thought he might be gay. But it took him a process of six years to come to terms with his own sexuality, and by 23 years old, he knew he was gay.
What he argued, and this was the hypothesis he put forward, is that there are probably about four or five companies, and maybe some nations, on the planet today that could have definitively told him, when he was 17 years old, at a very impressionable age: you are, or you are not, who you are. I mean, they could definitively tell you that right now. And that challenges the notion of identity to the deepest core of who you are.
So what does that mean? It means there are companies out there that know more about you than you know yourself. This is a real issue. There is so much data out there, concentrated in so few institutions, that if they wanted to, and if they were left to their own devices, they could essentially dictate life as we know it, or could guide life as we know it.
Currently, it’s in the form of enhancing preference or generating demand. But in the future, it could be down to the core of identity. Who am I? Who am I as an individual, can I be reprogrammed?
He did a similar talk earlier this year with, I think, the graduating [class of] Georgetown University, where he talked to the students about hackable animals. At the end of the speech it became very clear and evident that the animal he was referring to was the human being: the graduating class itself. So he was postulating to the graduating class: are you going to be that hackable animal? Or are you going to finally take control of what is your identity?
Joe Green (host): And I think there's a lot going on at the moment in the mainstream media about the realisation that humans are being hacked: the hackable human, if you like. And it's not an immediate thing. It's not having an advert flashed in front of you one day and then buying something the next day; it's not as simple as that. I think it's more long term. It's more insidious, if you will, although that word has negative connotations. But I think that over longer periods of time, maybe five to 10 years, people can and will be influenced. And I think that's something we are becoming aware of, and perhaps are beginning to accept, correct?
Naveen Menon (guest): And let me ask you, Joe, I want to ask a question of you as well. What would you do if, three years from now, a company, I don't know what company, a startup, a SaaS provider (software as a service provider), comes to you and says: here's a device. You can wear it, or you can implant it into your body. If you wear it for a period of a year, and if you pay this amount of money, I guarantee that I will extend your life by 10 years. You'd probably say yes!
Joe Green (host): Absolutely. You’d probably say yes.
Naveen Menon (guest): So that's the issue. That's a very compelling value proposition. You pay $1,000 in a subscription, per month let's say; over the course of a year, that's $12,000, or pounds [sterling]. And they guarantee you that you could live another 10 years, because they'd be collecting all your biometric data. That biometric data would be fed into an algorithm, and that algorithm would be able to tell you exactly what to eat, when to sleep, how to behave, where to go. And that would essentially enable you to live 10 more years of your life, right? Now, it's a pretty compelling value proposition, but it poses a very ethical question: when is it too far?
So it's fine to talk about preferences: okay, I think I'll get that green garden furniture. But that is essentially what we know now. Even in 2013, when I started this work, we didn't realise that algorithms could do that! We didn't realise that companies, or SaaS companies, could do that. We thought it might be possible that we could get preference data, that we might be able to create demand. At this point in time, we know that the preference data is there, we know that we can create demand through these algorithms. But there's a hypothesis that we could actually move into more areas of conflict, of ethical conflicts, in the future. And this is what concerns me right now. The ethics are, of course, key to this.
Joe Green (host): Do you think it’s essentially a question of human nature, human nature writ large, throughout these enormous data centres the size of football fields? And do you think that there’s something askew here? Obviously, we’re doing a lot of things with AI, and we’re using these things at the moment to essentially sell things or sell power; sell influence? Is it the case that technology is just so young? We haven’t realised its capabilities yet?
Naveen Menon (guest): I like to be positive about it. I mean, I’m in the technology industry. And I feel that the technology has got tremendous power to do good.
So for example, back in 2014, we were looking at working with the three mobile operators in Western Africa to halt the Ebola virus. We looked at data, at macro data at a top level, to see how population movements were working, so that we could put medical posts in the right place to stop people interacting with each other, and then give people a buffer period, a buffer distance, so that they didn't spread the virus further.
And that was very difficult. I mean, we managed to get one out of the three mobile operators to provide that data. So I think there are very, very good cases of where data can be used for good. Similarly in global warming and climate change, in inclusion, in increasing education, health care, etc., space exploration […]. There are many fields in which data can be used well, but it goes back to my original point.
When you asked me about regulatory frameworks, whether we need more regulation now, I said no, we don't need more regulation: we need more fit-for-purpose regulatory frameworks. The regulatory frameworks we're using right now don't support this innovation on the one hand.
But on the other hand, they don't guard against the unintended consequences of bad actors that manipulate the ecosystem and do some of these bad things that we're talking about, and that potentially leads to ethical dilemmas in the future.
So whatever we do, we need to be aware that there are benefits, but there are significant consequences. And there are things that we are giving up about ourselves, about our companies, about our families, when institutions can potentially do harm as well as good.
Joe Green (host): I think the most obvious example, I guess, in terms of what you're defining as a bad actor, is a hacker; it could be an organisation, maybe an organisation of hackers, engaged in criminal activity, stealing intellectual property, and maybe that might even close a business down.
What we’re talking about here is the way that data can be misappropriated. I see from your biography that a lot of your work or some of your work is around advice to companies and organisations on their ongoing digitization. Do you think there’s a gap between what people expect technology to deliver them, and actually what the reality is, which can be quite a lot more threatening and unpleasant?
Naveen Menon (guest): Yeah, I think there's always a gap. The companies that we work with have very strong ideas about what they want to do to support their constituencies, their stakeholders, their shareholders, whatever it is. And so we like to work with them to meet those needs.
But what we find along the way is that technology can sometimes meet those needs, and sometimes it can't; sometimes it requires fundamental process change, or fundamental policy change, or fundamental people change in order to implement that technology and get the right outcome that the institution is after. I think the gap is there, but it's about working with people, educating and creating more awareness about technology and what it can do to deliver the outcomes you're looking for. There are a lot of conversations that are needed. And the technology is getting more complex every day, so it's constantly changing.
Joe Green (host): And clearly, therefore, you see a role for the private sector to play in that process.
Naveen Menon (guest): Yeah, absolutely. The knowledge economy is growing and growing. And, you know, corporations involved in the knowledge economy have got a very strong part to play in shifting mindsets, but also in ensuring adequate learning is in place so that other companies and governments know about the benefits and the risks.
So definitely, you're finding a lot of activism. I think about 178 companies in the US, the CEOs of those companies, signed up to the concept that it's more about stakeholder capitalism than shareholder capitalism, right? Technology companies especially are seeing that [shift]: their ecosystem is a lot wider than just their shareholders. So definitely, companies have a role to play.
Joe Green (host): So obviously, at the moment there are no physical events going on, nothing. But if I were to put something in the show notes at the bottom of this podcast as a call to action… if you could wear two hats at once, if you like, your Cisco hat and your consultative hat, what would you urge people to go and do next? Those who want to read up, and maybe get a little bit more media, about some of the issues that we've talked about today?
Naveen Menon (guest): Yeah, a few things I would say. Firstly, I would say: constantly make sure that you are re-skilling yourself. Because what you're doing now will certainly be very different to what you'll be doing in a couple of years' time, and to what you'll need to do to be relevant in a couple of years' time.
The best example I like to give, particularly when it comes to the use of data, is driving a car. I don't know what the demographic of your audience is, but I remember my dad (who just recently passed away) and his car: the first car that he bought when we moved to the Netherlands. It didn't have much to look at, you know; it was a great car, but it didn't have much. You looked at your steering wheel, you had a speedometer. It didn't have a rev counter, but it had a radio, a gas [gauge] and a temperature meter. And a stick shift! But that's pretty much it in terms of the data that you got when you were driving. Of course, if you hit the indicator, you'd see the light blinking as well, which was great! But compare that with now and it's a completely different world.
So if you can imagine someone from back then time warping and coming forwards into the future, if that was at all possible, and sitting in a car now, they would become a little bit lost, I think, because there's so much data coming at you. But we've gotten used to it. So I'm a big believer in human progress: I believe people can reskill themselves to take advantage, and learn how to adopt data into everyday life.
Just by the nature of this change that's happened, as this little example from the automotive sector shows, right? So I think you have to constantly reskill yourself.
You've seen the same with factory workers, you know, or agricultural workers, who are a big part of the Asia Pacific economy. They're doing less manual work, more operator work, more responding to data, than the sorts of manual work that they used to do in the past. So that's a big thing. Number one: reskill yourself.
The second thing I would say is: ask questions of the providers of your technology solutions, right? Don't be afraid to ask simple questions. There's a lot of smoke and mirrors, I think, in many technology companies, trying to overcompensate and make things a little bit too complicated, because maybe they didn't want to answer the tough questions, I don't know…
But I think you should ask questions of your technology provider: where’s my data being held? Who accesses my data? Why is this service that I’m using free? How do you get paid, […] if you’re not charging me anything for it? What can I do if I want to cancel my service? How do I cancel my service? How do I make sure that when I cancel my service, all my data comes back to me? Because it was my data?!
Or is it mine when I sign up to a service? Do I lose my data? Is my data becoming your data? Right? Ask questions like that. I think very few people actually ask questions like that, especially of my generation.
I think my kids' generation, they're a little bit more comfortable with asking those questions, because they've grown up with it.
Then the last thing I would say is: citizens need to basically hold institutions accountable, and need to make one small change every day. So do ask those questions, and it's also important to responsibly choose technology vendors for whatever you do, right?
Choose vendors you believe are inherently ethical and that have taken steps to demonstrate their ethics. I think we're getting into a much more nuanced world, where access to capital is cheap, the bar for access to technology solutions is low, and switching costs can be very high. But if you feel that a tech vendor is really not ethical in the way they treat you or your data or your identity, I think it's time to make a change.
At Cisco, I think we've made a very, very strong commitment towards privacy: we have recognised that privacy is a fundamental human right. We genuinely believe that it is good for business. But most importantly, it's also good for individuals and for our customers.
I think that institutions and corporations need to recognise that privacy is important for them, and for their going concern as a company. Because I think that if you find that your data is compromised in any way, shape, or form, that could potentially put your company out of business.
And that has not been the case in the past, where a breach or a breakdown in IT service has led to this kind of drastic effect. But that does happen nowadays. If you lose your account, if you lose your institution’s data, or if your customers’ data is compromised, that could be the end of your company.
So I think that you need to take action, if you see that there are vendors that you’re working with that are not living up to the values that they portray. So those are the three things I would say.
Joe Green (host): Naveen, it's been absolutely fascinating talking to you today. But as you can probably tell, the producer gradually bringing the music up signals (and not very subtly, in my opinion) that we've run out of time today. So thank you very much, Naveen Menon of Cisco Systems, for joining us today. And thank you, dear listeners, for joining us too! I hope that you can join me on the next episode of the Tech Means Business podcast. See you soon. Bye!
By Joe Green
Joe Green is a writer based in Bristol, UK. He bought his first Mac and dial-up modem in 1992 and has worked in the tech industry since 2000. He specialises in networking, open-source, online privacy and data security.