Securing data provides Canadian online bank rapid path to new credit card business

The next BriefingsDirect data and security transformation use-case scenario describes how Tangerine Bank in Toronto has improved its speed to new business initiatives by gaining data-security agility.

We’ll now learn how improving end-user experiences for online banking and making data more secure across its lifecycle has helped speed the delivery of a new credit card offering.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

Here to explore how compliance, data security technology, and banking innovation come together to support a digital business success story is Billy Lo, Head of Enterprise Architecture at Tangerine Bank in Toronto. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: First, tell us a little bit about digital disruption in the banking industry. Obviously, there are lots of changes in many industries these days, but it seems that banking is particularly within the cross-hairs of disruption.

Learn More About Safeguarding
Data Throughout Its Lifecycle
Read the full Report

Lo: No doubt about that. Our bank used to be known as ING Direct. It started in Canada about 20 years ago. Our founders initially recognized this need and started a journey. Since then, we’ve been full-speed ahead. We’re seeing the savings that we get out of being branchless and passing that back to the clients. That message resonates very well with our client base, and so far, so good.

Gardner: When you say online banking, there are no branches or no brick-and-mortar buildings with the word “bank” on the front. It’s all done by mobile, via online. Am I missing anything?

Lo: On top of the fully digital experience, we also actually dive into a little bit of the physical as well, but not in a traditional way.

At Tangerine, we have a couple of other in-person kinds of channels. One is what we call a café. In an informal setting, you can get a coffee or orange juice at the café and get some advice. But most of the functionality is available through the digital channel, through a tablet onsite with someone guiding you along the way.

We’ve recently been exploring a concept called Mobile Pop-Ups, but not at malls. We refurbished shipping containers and put them into different locations to introduce the concept of the bank to different geographies. We also found that very rewarding, because you can reach many people online, but there are still some who need a little extra nudge to get comfortable with starting a banking relationship online.

User expectations

Gardner: That brings up an interesting topic. User expectations are also rapidly evolving in our world. Is there something about somebody who is attracted to online banking that you need to be aware of? Is there something about speed or agility? What is it about the banking customer who prefers online that you need to cater to?

Lo: Dana, you’re right on the point. In this case, both speed and agility are expected from a bank that highlights its services in terms of the online user experience. People are now used to the Gmail inbox, Facebook, instant messaging. The good old days of submitting a form and waiting for someone to come back to you are gone, really gone.

From an expectation point of view, we’re heavily impacted by the consumerization of technology. All those things that you see on a smartphone, taking pictures and depositing a check, are almost, as we call it, table stakes. We have to work harder at inventing things that surprise and delight our clients.

Gardner: Of course a big part of being able to delight your customers is to know them and have data about them that you can use to allow services to be customized and personalized. So data is essential, but at the same time, you’re in a highly regulated business where privacy issues and security are big concerns. How are you achieving the balance between data availability and data protection?

Lo: We in the banking business are in the business of trust. In everything that we do, trust has to be number one. We have to be ready for any kind of questions from our client base on how we handle the information. There’s no doubt that transparency will help, and over time, with transparency, our clients learn that we’re up-front in how we’re using information. And it’s not just transparency, but also putting the information in a way that’s easily understandable up-front.

If you look at our registration process, one of the first things we tell people is, “Here is our not-so-fine print.” It’s in big, bold fonts, and that’s very important, because in a digital bank especially, the lion’s share of interactions are not face-to-face. If you invest the time in being transparent, invest the time in building up your security infrastructure to protect your information, and stay vigilant about everything that is currently happening, it can be done.

Gardner: Tell us a little bit about your journey toward this new credit-card offering and why putting the blocks of infrastructure investment in place in advance is so important for agility and for quality of service in a new offering.

Lo: Let’s take this journey back a little bit as far as our credit-card offering is concerned. We started out as a savings bank and highlighted our high-interest offering at the beginning. That resonated well, and we quickly recognized the fact that we’re going to need to expand our product offering. People actually wanted to use us as an everyday bank.

Unfortunately, at the time, we didn’t have the complete suite of products that our clients would need. So, over time, we built up mutual funds, investments, and mortgages, and the last piece of the puzzle is credit cards. Once we have that, we can officially say to everybody that we’re not just a peripheral bank, but a full-service bank with the functions you need to support your everyday life.

In our case, efficiency and the speed of adoption are key. Every month that we wait for this offering to come out of the door, we’re losing opportunities to turn a regular client into a full-service client. And we were starting from scratch. We had zero infrastructure. We hired. We built up the technology behind it and partnered with a few of our trusted partners to build up the infrastructure, but a foundation takes time to do right.

Foundational effort

One thing that not a whole lot of people understand is the foundational effort. If you spend a month or two on building up the right foundation, the savings going forward are actually exponential. With HPE, we adopted the tokenization solution to help protect [credit] card number information. We were able to complete the whole journey in a very quick fashion. That saves us a lot of time, because everything revolves around the card number. If we don’t get the foundation done right at the beginning, quickly, the cost and schedule impact is exponential.

Gardner: So quality is important because you want to get it right the first time. It’s not just doing it quickly; it’s also doing it correctly. If you have to go back and redo infrastructure, that can be a huge tax on your innovation and really put a cultural drag on how things proceed.

Lo: Right on, and I don’t even want to think about it. Seriously, on the adoption of these foundational components, speed is key and that saves us a lot of hassle going forward in conversion as well as data cleansing. Once the cat is out of the bag, if you will, it’s so much harder.

Gardner: Billy, I’ve heard from other organizations that recognize that moving data around in the old-fashioned way doesn’t work. For being PCI compliant and meeting privacy requirements, having less data and less detailed information about a customer is, in fact, much more desirable. Is that the case with you and the tokenization process and encryption use? How would you describe what data to keep, what data to transact, and what the right balance is?


Lo: Just as any other security person would tell you, you have to know where the walls and the doors are for sensitive information. We made a conscientious effort to identify where we would actually need the card number available, such as for collections or for some operational processes, identify who needs it and where the door needs to be, and then lock everything else up. Tokenization allows us to do that without too much overhead, and overall, the experience has definitely been worth the time invested.

Now, I have one place to monitor, one door to monitor. As soon as I allow access to that information, I have an audit trail of who accessed what, when, and how. That gives me the comfort level I need. Our clients specifically demand it, both on the business side and from the front-end client point of view. They appreciate that.
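That single monitored door, with its who/what/when/how trail, is easy to picture in code. The sketch below is purely illustrative Python, not Tangerine's or HPE's implementation; the role names, the log shape, and the stubbed reverse-tokenization call are all assumptions:

```python
import datetime

AUTHORIZED_ROLES = {"collections", "fraud-ops"}  # who may pass through the "door"
AUDIT_LOG = []                                   # in production, append-only storage

def _reverse_tokenize(token: str) -> str:
    # Stand-in for the real reverse-tokenization service.
    return "<PAN for " + token + ">"

def detokenize_gateway(token: str, role: str, reason: str) -> str:
    """The single door: check authorization, record who/what/when/why,
    then release the real value or refuse."""
    entry = {
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "who": role,
        "what": token,
        "why": reason,
    }
    if role not in AUTHORIZED_ROLES:
        entry["outcome"] = "denied"
        AUDIT_LOG.append(entry)
        raise PermissionError("role " + repr(role) + " may not detokenize")
    entry["outcome"] = "granted"
    AUDIT_LOG.append(entry)
    return _reverse_tokenize(token)
```

Every read of a real card number, successful or not, leaves an audit entry; monitoring one function is far easier than monitoring every system that might otherwise hold the number.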

Gardner: For some of our audience, who are not security folks per se, describe what secured data and stateless tokenization mean. How does it actually work, architecturally?

Lo: Imagine your card number, or any kind of personally identifiable information (PII) that is important to you. Think of it as a piece of fruit, an apple, that you pass around to identify yourself. Tokenization, and in particular the Stateless Tokenization technology that HPE offers, works as an exchange process. The middleman takes your apple and turns it into a pear through a specific algorithm. The reverse process applies when someone hands the middleman a pear and asks for the actual apple; the original comes back to you.

So, every piece of information passed along in the message exchange goes through this process. The key term here is stateless, of course: we don’t have a store of this mapping information sitting somewhere, which would become yet another vulnerability. That makes our operations a lot easier, especially in a multi-data-center environment.
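To make the apple-to-pear exchange concrete: HPE's actual stateless tokenization is a proprietary, vetted technique whose details differ, but the general idea of a keyed, format-preserving mapping that needs no stored lookup table can be sketched with a toy Feistel permutation. Everything below is illustrative Python, not HPE's API, and is not cryptographically vetted:

```python
import hashlib
import hmac

KEY = b"demo-secret-key"  # illustrative only; real systems use managed, HSM-held keys

def _round_value(half: int, round_no: int) -> int:
    # Keyed pseudorandom function for one Feistel round.
    msg = ("%d:%d" % (round_no, half)).encode()
    digest = hmac.new(KEY, msg, hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") % 10**8

def tokenize(pan: str, rounds: int = 8) -> str:
    """Map a 16-digit card number to a 16-digit token, keeping no mapping table."""
    left, right = int(pan[:8]), int(pan[8:])
    for r in range(rounds):
        left, right = right, (left + _round_value(right, r)) % 10**8
    return "%08d%08d" % (left, right)

def detokenize(token: str, rounds: int = 8) -> str:
    """Run the rounds backwards with the same key to recover the card number."""
    left, right = int(token[:8]), int(token[8:])
    for r in reversed(range(rounds)):
        left, right = (right - _round_value(left, r)) % 10**8, left
    return "%08d%08d" % (left, right)
```

Because the token is computed rather than stored, a multi-data-center deployment needs only the key in each location; there is no mapping database to replicate, back up, or breach.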

Gardner: So, you get the use of that tokenized data, but you don’t have to store it. It’s not in the state in different places that then have to be protected. There are fewer spots where somebody could be liable to expose it or get access to it.

Difficult to guarantee

Lo: No doubt. In fact, think about a larger-scale environment where pieces of information are stored in the cloud, in multiple data centers; in some cases, you may not even know physically where they are. It’s very difficult to make that guarantee and say that we know where our information is, and that’s just the online data. Then there are the backups that are necessary to run a successful operation.

Gardner: We’ve heard now that you started from scratch with your credit-card activities. You put in the necessary infrastructure, recognizing that doing it right and fast is a great saver over time. Tell us a little bit about the actual credit card project. How has it come about, and where are you in its delivery to the market?

Lo: It’s been very exciting for us. Our clients have been looking forward to this. We started the public launch in March this year, after about a six-month trial within the bank and with some selected clients. We’re now at full force and we’ve been running campaigns.

How do we do this? How do we attract our clients? First of all, being transparent. Our product features are very specific, and we don’t hide the interest rate. We’re very upfront about fair fees. We’re offering promotions right now in three categories. We have four percent cash back for our product, which is a very attractive offering that the market is looking forward to. It’s been working really well.

Gardner: And what’s the name of the card? Is it just Tangerine Bank card or is there a branded name to it?

Lo: There is nothing fancy about it right now. This is our only card; so, it can’t go wrong.

Gardner: And is it both debit and credit?

Lo: We’ve had a debit card for a while now. In this case, with the credit card, the technology behind it uses the typical chip-card infrastructure as well as MasterCard PayPass Tap and Go. And we’re also venturing into mobile payments in the very near future.

Gardner: That was my very next question. Now that you’re a full-service bank online, more and more people are wondering how to automate the payment process, particularly with a mobile device. We’ve seen other organizations attempt this, but it doesn’t seem to have gone mainstream yet. Tell us what you foresee for mobile payments and how you might be a leader in that market.


Lo: In the Canadian marketplace, the merchant landscape is very different from most other geographies. Well over 80 percent of the merchants in the Canadian marketplace are already Tap and Go and chip ready.

With the adoption of mobile payments in big-vendor environments such as Apple Pay as well as Android Pay, we’re very, very optimistic. Tap and Go is already a significant component of the payment process, especially for small amounts. Naturally, this is just an extension, whether it’s through the mobile phone or your watch. The impediments that other geographies have around merchants’ reluctance or infrastructure constraints don’t really exist in the Canadian marketplace. So, we’re ready.

Extra distance

Gardner: It seems to me that, given the emphasis on user experience and convenience, when organizations like yours go the extra distance and make that user experience simple, transparent, and worthwhile in terms of convenience and productivity, customers will put more and more of their transactions onto that card. It could become central to their lives. Is that part of your strategy?

Lo: Yes, in many ways. The Tap and Go payment process, once the merchant environment supports it, is very, very efficient. The more information we have about where our clients are spending their time, the more we can customize our offering to cater to their specific needs and personalize insights that support their everyday life. No doubt about that. In fact, speaking of the credit-card offering and differentiation, one of the things we made very clear is that convenience can come at a cost to people’s comfort level in using the product.

Now, if you lose your phone, what’s going to happen? We made it a very high priority to enable our clients to freeze the card very easily. Let’s say I leave my card in a restaurant. I just pick up my phone or go to an Internet-connected device and freeze my card — don’t cancel it, freeze it — until I find it. So, we took quite a bit of time in exploring and making sure that people will feel comfortable using these new channels.
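The freeze feature Lo describes is essentially a small state machine, where freezing is reversible and cancelling is not. As a hedged illustration only (Tangerine's actual implementation is not described here), the states and transitions might look like:

```python
class Card:
    """A card is active, frozen (reversible), or cancelled (terminal)."""

    def __init__(self) -> None:
        self.state = "active"

    def freeze(self) -> None:
        # A client who left the card in a restaurant freezes it instantly.
        if self.state == "cancelled":
            raise ValueError("a cancelled card cannot be frozen")
        self.state = "frozen"

    def unfreeze(self) -> None:
        # Found the card? Reverse the freeze; no replacement needed.
        if self.state != "frozen":
            raise ValueError("only a frozen card can be unfrozen")
        self.state = "active"

    def cancel(self) -> None:
        # Irreversible: a replacement card must be issued.
        self.state = "cancelled"

    def authorize(self, amount_cents: int) -> bool:
        # Frozen and cancelled cards decline every authorization attempt.
        return self.state == "active"
```

The design choice is the point: freeze is cheap to offer and cheap to undo, which is what makes clients comfortable acting the moment they notice the card is missing.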

Gardner: It sounds like we’re only just scratching the surface on these ancillary services that could be brought to bear when you have the underlying infrastructure in place, the security and data availability in place. It’s going to be interesting in the next several years how convenience can be even completely redefined.

Lo: Yes. We can’t wait to continue to innovate for our clients, and in many ways, our clients are looking forward to all of these things as we progress. Banking is our everyday life.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.


CPO expert Joanna Martinez extols the virtues of redesigning procurement for strategic business agility

The next BriefingsDirect business innovation thought leadership discussion focuses on how companies are exploiting technology advances in procurement and finance services to produce new types of productivity benefits.

We’ll now hear from a procurement expert on how companies can better manage their finances and have tighter control over procurement processes and their supply chain networks. This business process innovation exchange comes to you in conjunction with the Tradeshift Innovation Day held in New York on June 22, 2016.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about how technology trends are driving innovation into invoicing and spend management, please welcome Joanna Martinez, Founder at Supply Chain Advisors and former Chief Procurement Officer at both Cushman and Wakefield and AllianceBernstein. She’s based in New York. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What’s behind the need to redesign business procurement for agility?

Martinez: I speak to a lot of chief procurement officers and procurement execs, and people are caught up in this idea of, we’ve got to save money, we’ve got to save money. We have to deliver five times the cost of our group, 10 times, whatever their metric is. They’ve been focused on this, and their businesses have been focused on this, for a long time.

The reality is that the world really is changing. It’s been a 25-year run of professional procurement and strategic sourcing focused on cost out, and even the most brilliant of sourcing executives, at some point, is going to encounter a well that’s run dry.

Sometimes you work in a manufacturing company, where there is a constant influx of new products. You can move from one to another, but those of us who have worked in the services industries — in real estate, in other kinds of businesses where a tangible good isn’t made and where it’s really a service — don’t always have that influx. It’s a real conundrum, a real problem out there.

I believe, though, that these events and changes are forcing the good, the smart procurement people to think about ways they can be more agile, accept the disruption, and figure out a way to continue to add value despite it.

Gardner: So perhaps cost-out is still important, but innovation-in is even more important?

Changing metrics

Martinez: That’s it, exactly. In fact, I have seen some things written lately. Accenture did a piece on procurement, “The Future Procurement Organization of One,” I think it was called. They talked about the metrics changing, and that procurement is evolving into an organization that’s measured on the value it adds to the company’s strategy.

People talk a lot about changing the conversation. I don’t think it’s necessarily changing the conversation; it’s adjusting the conversation. After you’ve been reviewing your cost savings for the last five years for your CFO, you don’t walk in one day and say, “Now we’re going to talk about something else.” No, you get smart about it, you start to think about the other ways you’re adding value, and you enhance the conversation with those.

So, you don’t go from a hundred to zero on the cost savings part of it. There’s always going to be some expectation, a value added in that piece, but you can show relatively quickly that there are a whole lot of other places. [See related post, How new modes of buying and evaluating goods and services disrupts business procurement — for the better.]

Gardner: While it might be intimidating to some, it seems to me that there are many more tools and technologies that have come to bear that the procurement professional can use. They have many more arrows in their quiver, if they’re interested in shooting them. What do you think are some of the more important technological changes that benefit procurement?

Martinez: Well, there are all these services in the cloud. It’s become a lot cheaper and a lot faster to move to something new. For years, you had a large IT community managing the disruption of trying to put in a product that had to be integrated with every piece of data and every server.

It’s not over, because a lot of those legacy systems are there and have to be dealt with as they age. But as new services are developed, people can learn about them and will figure out ways to bring them to the company. They require a different kind of agility: it’s OPEX, not capital expense. There is more transparency when a service is provided in the cloud. So some new procurement skill sets are required.

I’m going to speak later tonight, and I have a picture of an automobile assembly line. It says, “This is yesterday’s robot.” When you talk about robotics, people think of Ford Motor Company. The reality is that robotics are being used in the insurance industry and in other industries that process a lot of repetitive information. It is robotics applied to information work. The procurement organization knows these suppliers and sees what the rest of the world is doing. It’s incumbent upon procurement to start to bring that new knowledge to companies.

Gardner: Joanna, we also hear a lot of these days about business networks whereby moving services and data to a cloud model, you can assimilate data that perhaps couldn’t have been brought to bear before. You can create partner relationships that are automated and then create wholes greater than the sum of the parts. How do you come down on business networks as a powerful tool for procurement? [See related post, ChainLink analyst on how cloud-enabled supply chain networks drive companies to better manage finances, procurement.]

Martinez: Procurement has to get over the “not invented here” syndrome. By the way, over the years I have been as guilty of this as anyone else. You want to be in the center of things. You want to be the one at the meeting with the suppliers coming in and the new product development people at your company.

The procurement organization has to understand and make friends with the product development and revenue-generating side of the business. Then they have to turn 180 degrees, look to the outside world, and understand how the supplier community can help to create those networks, move on to the next one, and be smart enough in the contracting, in things like termination clauses, to make sure that those networks can be decoupled when they need to be.

Redesigning procurement

Gardner: Do you have any examples of organizations that have really jumped on the bandwagon around redesigning procurement for agility? What was it like for them, and what did they get out of it? It’s always important to be able to go and show some metrics of success when you’re trying to reinvent something.

Martinez: If you’re looking for an example, you’ve got Zara, the global retailing chain. Zara changes their product constantly. They’re known for their efficient supply chains. They do have some in-house manufacturing, but it’s for the basic, high-volume product, where lean manufacturing is important because the variability is low and the volume is high.

When you get to things like the trend of the minute, be it gold buttons, asymmetrical hemlines, or something like that, they’re using a network of third parties to do that. In those cases, the volume is low, the variability is high, and so they create and disassemble these networks.

Whether financial services companies realize it or not, there’s a lot of agility built into that. There are some firms, some third parties, that a financial services firm will use to get those shareholder reports out. The firm sends them the monthly reports, and those companies handle very high volume with excellent quality controls. Post offices are on-site. They don’t even truck it to the post office; the post office is sitting right there, and the mailings go out.

When you need to do something, for example a special mailing on a particular fund or shareholder meetings that might only be held once every couple of years, you find yourself in a situation where those kinds of networks don’t serve you very well, and you have to kind of assemble and disassemble temporary networks.

Gardner: We hear a lot these days, with services organizations in particular, that finding labor and skills is a big issue for them. It seems to me that when we look at some of the tools that procurement is using, and the role that procurement is playing, that perhaps there is some more synergy between procurement and human resources management than we have seen in the past.

Do you see that as a potential benefit when you’re looking for agility in procurement, that they should be working hand-in-hand, perhaps using some of the same platforms and methods across procurement and human capital management (HCM)?

Martinez: HCM is an important organization for procurement to bond with. Often, in a company, there’s a lot of technology and human resources (HR) spend, and not a lot of professional management of the third parties that receive that spend.

There are consultants who can advise you on insurance policies, but they’re not always using the best tools to go out and find those providers. Sometimes, there are relationships, payments, rebates, and that sort of thing in play that the HR community might not be aware of or asking about.

In HR, legal, and some of the other parts of a company that often use services, there are technology solutions that are coming in place. So, if you’ve got a procurement specialist working with HR who knows a lot about recruiters and doing deals with recruiters, they had better be learning how to do a deal with LinkedIn. They had better be able to understand that those traditional service providers are not going to be needed any longer.

Procurement advice

Gardner: What advice would you give procurement professionals who are interested in redesigning their procurement for agility? Maybe they haven’t begun that journey fully. What would you advise them as important opening steps or ways of thinking?

Martinez: Two things. Number one, there’s no reason your organization is going to call you up one day and say, “You can do this differently.” You have to be self-motivated, recognize that the change has to occur, and do it yourself. I was going to say to ask forgiveness, not permission, but you’re not going to have to ask forgiveness, because you’re going to find lots of good things.

The other thing is that there are supply chains embedded all through organizations, even when no one in the organization has heard the term “supply chain.”

Procurement organizations have to think about making sure that someone in their group understands supply chain or understands that mentality of owning something from start to finish, because as long as you’re looking at discrete little pieces, you’re not going to extract the maximum value.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Tradeshift.


How new modes of buying and evaluating goods and services disrupts business procurement — for the better

The next BriefingsDirect business innovation thought leadership discussion focuses on how new modes of buying and evaluating goods and services are disrupting business procurement.

We’ll hear now from a leading industry analyst on how machine learning, cloud services, and artificial intelligence-enabled human agents are all combining to change the way that companies can order services, buy goods, and even hire employees and contractors. This business process innovation exchange comes to you in conjunction with the Tradeshift Innovation Day held in New York on June 22, 2016.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about how new trends are driving innovation into invoicing and spend management, please join me in welcoming Pierre Mitchell, Chief Research Officer and Managing Director at Azul Partners, where he leads the Spend Matters Procurement research activities. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: We’re seeing an awful lot of disruption in how companies can buy and sell goods and how suppliers can reach new markets. What is causing this disruption?

Mitchell: The technology is disruptive. In the old days, a lot of procurement executives would just say, “The technology is really just enabling our existing process, it’s really just a tool to automate the processes that we’re looking to do.”

That’s starting to change. Technology is fundamentally disrupting value chains. You see what’s happening in the business-to-consumer (B2C) world and the disintermediation that’s happening. Amazon, Uber, and Airbnb are having big impacts, and that’s not limited to the B2C world. Look at the impact of Amazon, Uber, Airbnb, and now someone like Tradeshift. What’s going to be the impact on the business-to-business (B2B) travel process, on the supply-chain process, on freight forwarding, on logistics? It’s going to be a major impact.

So, you can say that technology is just automating, but it’s not. It’s enabling new, much more innovative value chains, and it’s truly disruptive. I know it’s a buzzword out there, but it really is.

Go and Skills

Gardner: From what you’ve heard at Tradeshift’s recent announcements around Go and Skills, what are the factors that combine in a way that you think are quite new or something that we haven’t seen before? [See related post, ChainLink analyst on how cloud-enabled supply chain networks drive companies to better manage finances, procurement.]

Mitchell: The Skills terminology is interesting. When you look at Skills, they’re really talking about a fairly atomic or higher-level kind of business process as a service. And if you’re going to do business process as a service, it’s not just having a bunch of cloud apps, because cloud apps are basically a more efficient machine tool, if you will.

Just taking an on-premises app and deploying it in the cloud is great in terms of making deployment more efficient, but an empty app is an empty app. What really brings the app to deliver a business outcome, to deliver that business process, is intelligence. That intelligence is going to come partly from the bottom up, from analytics that turn information into insight, but it’s also going to come from how we take information and knowledge out of our minds and put it into that software.

That’s truly disruptive, and probably the topic of another conversation: what do we do with 30 percent unemployment as the robots come to take all our jobs? But certainly, in this kind of knowledge-based area, where there is some level of repetitive tasks, the game is starting to change from on-premises apps to software-as-a-service (SaaS) apps, and toward the cognitive, using those apps to really deliver business outcomes.

Gardner: I agree that this has wide implications across many industries and across many facets of any particular business. Just to focus on what Tradeshift is doing with Go, what’s interesting to me is that they’re combining accessible, but pertinent, real-time streamed travel data, analyzing that in the context of a data environment. But they’re also adding human travel agents, empowering humans who are very skilled in order to present very rapid returns for fairly complex business problems.

What is it about this combination of machine and human that is pushing boundaries today?

Mitchell: I like how they went about this solution. First of all, they started with the business problem and the outcome, especially in mid-market organizations, but also for large enterprises. We want to focus on making the process of buying and traveling much easier and much more intuitive, but still obviously with some of the controls that you need to have in place.

The problem is that a lot of these processes have been very siloed across multiple places. So you have your travel and expense reports, your purchasing cards (P-Cards), maybe an e-procurement system here and there, or maybe e-invoicing. You have all these different little channels that are dealing with bits and parts of the problem, but it hasn’t really come together as one kind of seamless experience.

Seamless experience

The only way that you can make that experience seamless is to have this combination of domain expertise around the process, the software to support it, and, more and more, this area of cognitive and the skills that empower humans to do the process better.

Probably more of the repetitive tasks that those humans were previously doing will be more bot-enabled rather than human-enabled. That’s going to happen over time, but ultimately, that frees up the humans to do higher value-added activity, rather than just these rote tasks.

Gardner: My sense is that it will start with rote tasks, but it could very easily move up a value chain of intelligence. The other interesting thing to me is that they’re using a messaging application, which people are very familiar with, and that brings it to a level of democratization where almost anyone in the organization can take part.

Furthermore, what’s interesting is the ability to act on it very rapidly. So, when you create a virtual credit card, you’re able to pay for something as rapidly as you’re able to find it. It really brings decision-making and execution down to a fundamental level of whoever in the business needs to act can act, and it removes all those middle layers. To me, that’s a fairly impressive productivity benefit.

Mitchell: What’s nice about it is that if you look at the changing workforce now, Millennials are entering the workforce. They’re highly messaging-based. So, it’s really accommodating a multichannel world. The new UI with the changing workforce is going to be messaging-based, but just because it’s quick, easy, and real-time, and it’s in a metaphor that they’re familiar with, doesn’t mean that your need for controls goes away.

The platform capabilities that Tradeshift is increasingly bringing to bear include the ability to take these little atomic services, whether that’s doing a budget check in real time, or taking what you’re asking for and turning that information into a commodity code or a merchant code, and to translate all this complexity on the back end.

That doesn’t go away. You’re just shielding the end-users from it and allowing them to work in a style that’s familiar to them. Too often, it’s been a trade-off between ease of use and high controls. If you can bring those two together, especially for this changing workforce, that’s a huge win-win.

Gardner: We hear a lot these days about the need for more productivity in our economy in general in order to create a better standard of living and increased wages and so forth. It seems to me that for many years, maybe generations, big businesses had an advantage over smaller business. They’ve been able to integrate processes, have efficiencies of scale, and buy and sell at scale.

But now, when you look at some of these technologies like Tradeshift has brought to bear, maybe mid-market and small companies will get an advantage. They can be fleet, agile, and use these services and cut their costs, while being innovative all along.

Do you share my sense that maybe this is a day and age where the smaller companies have an advantage?

Level of orchestration

Mitchell: Yes, and no. I would probably vote for the school of piranhas over the shark any day, but for those piranhas to win they have to be able to assemble with each other at will. That requires a new level of orchestration and standing up business processes to get those going, rather than what’s been available in the past.

So, taking a traditional enterprise architecture and trying to stand up these cloud-enabled, API-driven services in the cloud that are getting increasingly intelligent isn’t possible with the older technology.

I’m with you, and it does require a new class of technology to stand up these new value chains and these business networks.

Gardner: I suppose there’s nothing really stopping even the largest companies from bringing some of these atomic services to bear inside their organizations. Yes, you have to change some processes, but it seems to me that they might not have a choice when their competition gets there first.

Mitchell: Absolutely. There is so much activity going on right now around digital supply chain and digital disruption. Look at what’s happening to the supply markets. They’re getting digitized, and the supply chains are getting digitized.

So, who are the folks really responsible for helping the organization tap innovation from those supply markets? Hopefully, procurement is taking a leadership role in doing that. There’s a real fork in the road here for procurement to say, “Look, it’s time to help educate our stakeholders about how these value chains are going digital. How can we tap that?”

By the way, procurement is a service provider, too, and you are only going to get so much budget. So, if you can figure out some disruptive ways to carve off the stuff that makes absolutely no sense for you to be doing on an ongoing basis, you can automate that away and focus your time on going deep in certain categories, on innovation projects, and on things that are really going to make a difference.

The biggest cost in procurement is the opportunity cost of wasting your time on low-value activities, such as call-center stuff, and not doing the true profit-center, innovative kinds of things. Ultimately, you have to evolve or you’re going to die. “Stay above the API,” some people say.

Gardner: It sure seems like we’re now in a period where procurement can rise and become an evangelist within organizations for innovation across many different dimensions of the business that could have vast savings, but also put them in a highly competitive position when they could otherwise be disrupted.

So, to the procurement people, “Go get them,” right? [See related post, ChainLink analyst on how cloud-enabled supply chain networks drive companies to better manage finances, procurement.]

Can’t do it alone

Mitchell: Absolutely. And you have to work with IT and everybody else and work with your suppliers, too. You can’t do it alone, but what’s nice is that you’re finally starting to see some better options out there — a much bigger utility belt of tools that you can use to kind of make it happen, because otherwise, it’s just not possible.

Gardner: Last point, Pierre. It seems like it’s incumbent upon organizations to get a bit more experimental. There’s such a wide variety of new services coming on board. They might not want to bite off the whole enchilada, but do you share my opinion that being experimental, doing pilot projects, and trying new things is extremely important these days?

Mitchell: Absolutely. This whole notion of self-funding has just become part of the new normal. The idea is: what can you actually do in the short term that adds some new incremental value, demonstrates credibility, and engages your stakeholders, and in doing so unlocks getting to the next level, where you can build upon it? If it didn’t work, you redirect, but you need to work toward a long-term vision.

This is the part where platforms, architecture, and thinking some of the stuff through is important, so that you can do stuff in the short term and get some business results, but you want to work towards a more flexible and open architecture so that you have options. Because in procurement, and for the stakeholders, it’s all about having options and flexibility. That’s what enables agility, being able to have those options.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Tradeshift.


How Allegiant Air solved its PCI problem and got a whole lot better security culture, too

The next BriefingsDirect security market transformation discussion explores how airline Allegiant Air solved its payment card industry (PCI) problem — and got a whole lot better security culture to boot.

When Allegiant needed to quickly manage its compliance around the Payment Card Industry Data Security Standard, it embraced many technologies, including tokenization, but the company also adopted an improved position toward privacy methods in general.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

Here to share how security technology can lead to posture maturity — and then ultimately to cultural transformation with many business benefits — we’re joined by Chris Gullett, Director of Information Assurance at Allegiant Air in Las Vegas. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Let’s begin at a high level. What are the major trends that are driving a need for better privacy and security, particularly when it comes to customer information, and not just for your airline, but for the airline industry in general?


Gullett: The airline industry in general has quite a bit of personally identifiable information (PII). When you think about what you have to go through to get on a plane these days, everything from your full name, your date of birth, your address, your phone number, and your flight itinerary goes into the record.

There is a lot of information there that you would rather not have in the public domain, and the airline has to protect it. In fact, there have been a couple of data breaches involving major airlines and things like frequent-flyer programs. So, we have to look carefully at how we interact with our customers and make sure that data is incredibly safe. We just don’t want to take the brand hit that would occur if data leaked out.

Gardner: At the same time, we’re enjoying much better benefits by attaching more data to transactions and processes; we’re able to cross organizational boundaries. And so, the user-experience benefits of having more data are huge. We don’t want to back off from that, but we do want to make sure that data is protected.

What are some of the major ways we can recognize the need for better data uses, but keep it protected? Can they be balanced?

Technology fronts

Gullett: The airline industry is moving forward on a lot of technology fronts. Some airlines, for example, are using mobile devices to welcome specific customers on board with a complete history of how good a customer they are to that particular airline, so they can provide additional services in the air.

Other airlines are using beaconing [location] technologies, which I think is kind of cool. If you have the airline’s mobile app on your phone and you’re transiting through the airport, how cool is it to know where you are and how long it’s taking you to get through security? The airline might then adapt at the gate to whether there are going to be problems boarding that particular plane.

Learn More About Safeguarding
Data Throughout Its Lifecycle
Read the full Report

There are a lot of different data points that are being collected and used now with different airlines handling them in different ways. In any event, the need for privacy is important, especially in the European Union (EU), which has incredibly tight data-privacy protection laws.

Gardner: We’ve talked about that on this podcast series. Now, the answer isn’t just the old thinking around security, where we’ll just wall it off, or we’ll use as little data as possible. Instead, we need to have more data in more places — even down at that mobile edge.

So, as we think about ways to accommodate our need for more data in more places, even everywhere, is there top-level thinking that goes along with being able to make the data private, but also usable?

Gullett: That’s the balancing point. Everybody wants their data everywhere. Before, a data center protected data inside the tight little confines of the hardened shell you used to have: a perimeter with a firewall, and things like that. But we need data out at the edge where it’s actually being consumed; that’s what has to happen these days.

Some airlines are putting consumer PII right in the hands of the flight attendant on the plane. At Allegiant, for example, we’re using mobile devices to accept credit cards on the plane. We’re experimenting with a number of different technologies that, when you think about them, fall into the category of the Internet of Things (IoT). What they all have in common is that they’re outside any possible perimeter.

So, you have to find a way to make every device have its own individual perimeter, and harden the data, harden the device, or some combination of the two.

Gardner: Let’s hear more about your particular airline. Tell us about Allegiant Air and what makes it unique in the airline industry.

Regular profitability

Gullett: At Allegiant, we’re up to 54 consecutive quarters of profit, which is unheard of in the airline industry. The famous phrase about the airline industry is, “How do you become a millionaire? You start with a billion dollars and you buy an airline.”

The profitability of airlines has been much in the news over the last couple of decades, because it’s cyclical. Airlines fail, go into bankruptcy, or consolidate. There’s been a lot of consolidation in the United States, with United taking on Continental, and Delta taking on Northwest as examples. Southwest taking on AirTran is another. Everybody has been in the game.

Allegiant is kind of off on its own. We’ve found an interesting niche that has very little direct competition on the routes that we serve, and that is taking vacationers to their favorite vacation destinations.

We connect small- and medium-sized markets — markets like Kalispell, Montana or Indianapolis, Indiana, a medium-sized city. We’ll take them to Florida, Las Vegas, or Los Angeles. We have about 19 vacation destinations now. We have about 115 cities overall. In fact, we serve more cities than Southwest, if you want to get a comparison on the size of the route map. And we’re also taking the charter operators to three different countries in the Caribbean.

We have quite a different footprint. That adds up to about $1.3 billion in revenue a year, and from a profitability standpoint, Allegiant is regularly recognized as one of the most profitable airlines in the world.

Gardner: It sounds like most of your passengers, perhaps even all of them, are vacationers, not business travelers. Does that change anything when it comes to user experience, privacy, and data security?

Gullett: It doesn’t change anything as far as the need to protect the data, but it does raise the risk of brand damage from a data breach.

Consider the fact that our average customer flies with us once or twice a year. They are, in many cases, flying Allegiant, rather than driving to their vacation destination. Or maybe they’re taking a vacation they wouldn’t have otherwise because of Allegiant’s low prices.

So what you have are “not-frequent travelers.” In fact, if we were going to have a frequent-flyer program, it would be the “not-frequent-flyer program,” because vacationing people just don’t fly that frequently.

If I’m a business traveler on so-and-so [airline] and they had a breach, I’m going to continue to fly them, because I have marvelous status in their frequent-flyer program. An Allegiant customer says, “Gee, I’m a little concerned about that. If they have a data breach, I think I’ll drive instead.”

So the brand damage from a breach, I believe, is higher for our airline than some of the other airlines out there.

Everyone’s responsibility

Gardner: Given how important it is to your business, to your brand, how do you rationalize these approaches to security to the larger organization? I know that’s probably not as prominent a problem as it used to be, because we can see directly the business implications of security issues. But how do you make security everybody’s responsibility? Is that something that you have been trying to do?

Gullett: First, we’re very lucky at Allegiant to have incredibly broad support from the C-suite level and the board of directors for our security program. That’s not a benefit that every company has, but we do, and it certainly makes life easier in developing the procedures and processes, and the technologies, necessary to protect our customer data.

We came into the business at Allegiant with the idea that we have the typical triad of people, process, and technology to deal with in the information security program — the three legs on a stool. If you miss one of those, you are going to be on your butt on the ground because the stool isn’t going to work very well.

We focused on technology and process early on, because those were the easy things. Those were the low-hanging fruit. We’ve really moved into more of a stage of being people-focused now. In fact, much of our budgetary spend is on security awareness for our people.

We really had to look at how we best introduce security awareness to the entire company, and to make the company more culturally sensitive to information security. That extends from the customer service agent who’s checking you in at the ticket counter all the way up to the board of directors.

The [security leadership] has certainly chimed in and made our board more aware of problems concerning information security. Recently, U.S. Senator Edward Markey (D-Massachusetts) also introduced legislation that specifically targets cybersecurity in the U.S. domestic airline industry.

That need to protect the data has to be recognized, and the most important part of protecting the data is the people that are handling the data. Awareness is really a big part of our program now.

Gardner: How did PCI-compliance form a trigger for your organization? What did that change mean for you, and maybe you could explain how you have gone about it at the process, people, and technology levels?

Compliance requirements

Gullett: Well, God bless compliance, because I think I got my first information-security job thanks to an auditor telling someone they needed an information-security guy because of Sarbanes-Oxley. And I joined Allegiant because of PCI. These various compliance regulations have certainly done wonders for the job market in information security. I can only imagine what it’s like now with the EU General Data Protection Regulation (GDPR).

But, in regard to our journey into the world of PCI, Allegiant is also a unique airline in that the applications that run the airline are proprietary. We actually write them ourselves. We have a large development staff, and every aspect of the operation of the airline is run by custom software that we control and write.

There are a lot of benefits to that because it allows us to be very agile and flexible if we want to make changes, but there is a downside. Some of the code dates back to the green screen days of the 1990s, and that code was going to be very difficult to bring into compliance from a PCI standpoint. It was just not written with security in mind, and while it wasn’t directly handling credit-card data, it was in the process scope.


A big concern was how we were ever going to bring a significantly non-compliant custom app, one that would take a great number of application-developer hours to bring up to snuff, into compliance on a relatively tight schedule. At the time, we looked at a number of different products out there and thought, well, we can’t solve every problem right now, so let’s bite off small chunks and take care of those.

The first thing that looked fairly easy to do, or at least straightforward from a technology standpoint, was tokenization. So our search became: how can we tokenize the cards that we are storing? That led us to stateless tokenization. We compared a number of different products, and HPE [Secure] Stateless Tokenization was ultimately our choice.
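To make the distinction concrete, here is a toy Python sketch of the vaultless idea behind stateless tokenization: the token is derived from the card number via static random lookup tables, so no token database ever has to be stored or synchronized. This is illustrative only, not HPE’s actual algorithm; the table sizes and the fixed seed are stand-ins for real secret material.

```python
import random

# Static secret tables, generated once. A fixed seed is used here purely for
# illustration; a real product derives large tables from secret entropy.
_rng = random.Random(2016)
TABLES = [[_rng.randrange(10) for _ in range(10)] for _ in range(6)]

def tokenize_pan(pan: str) -> str:
    """Replace the middle six digits of a 16-digit PAN using table lookups,
    preserving the BIN (first six) and last four digits for routing/receipts.
    Deterministic: the same PAN always yields the same token."""
    head, middle, tail = pan[:6], pan[6:12], pan[12:]
    token_middle = "".join(
        str((TABLES[i][int(d)] + int(tail[i % 4])) % 10)
        for i, d in enumerate(middle)
    )
    return head + token_middle + tail
```

Because the mapping depends only on the shared secret tables, any system holding them computes the same token for the same card, with nothing to replicate and no vault to breach.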

Interestingly enough, while we were on our search for what the best tokenization product was, I happened to read a press release on a website that talked about format-preserving encryption as a new technology that was going to become available — and that actually became HPE SecureData Web. We found that by accident; it wasn’t even a product that was available at the time. It was going to be targeted at card acquirers, and we actually had a hard time convincing the sales folks to sell it to us as a different type of end-user.

That solved our application problem because it allowed us to encrypt the data that was passing through those legacy apps. Between the tokenization and the format-preserving encryption (FPE) SecureData Web product, we were able to dramatically reduce the overall scope of PCI data, and that finally led us to become compliant.
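Format-preserving encryption is what made the legacy-app problem tractable: a 16-digit card number encrypts to another 16-digit number, so green-screen-era field widths, validation rules, and database schemas are untouched. The toy balanced Feistel construction below only shows the shape of the technique; it is not the NIST FF1 mode that production FPE products implement, and the hash-based round function is purely illustrative.

```python
import hashlib

def _f(half: str, key: bytes, rnd: int) -> int:
    # Round function: keyed hash of one half, reduced to that half's digit span.
    digest = hashlib.sha256(key + bytes([rnd]) + half.encode()).digest()
    return int.from_bytes(digest[:8], "big") % (10 ** len(half))

def fpe_encrypt(digits: str, key: bytes, rounds: int = 8) -> str:
    """Toy FPE: balanced Feistel over an even-length decimal string, so the
    ciphertext has the same length and character set as the plaintext."""
    mid = len(digits) // 2
    left, right = digits[:mid], digits[mid:]
    for rnd in range(rounds):
        left, right = right, f"{(int(left) + _f(right, key, rnd)) % 10**mid:0{mid}d}"
    return left + right

def fpe_decrypt(digits: str, key: bytes, rounds: int = 8) -> str:
    """Inverse of fpe_encrypt: run the Feistel rounds in reverse."""
    mid = len(digits) // 2
    left, right = digits[:mid], digits[mid:]
    for rnd in reversed(range(rounds)):
        left, right = f"{(int(right) - _f(left, key, rnd)) % 10**mid:0{mid}d}", left
    return left + right
```

Because the output is still a well-formed string of digits, a legacy app that merely passes the field along never needs to know it is handling ciphertext.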

Gardner: Now, this sounds like, with custom apps, it could take months, even quarters. How much time did it take you, and how important was that to you?

Gullett: The time to implement any application outside of what we develop ourselves is always a concern, because it takes our developers, who then have to serve as integrators, off of projects that might lead to higher revenues for the airline, solve a problem, or offer a feature the airline would like. And we’re very focused on improving the overall business.

We found that the overall implementation of the HPE products was very efficient. In fact, I think we had one-and-a-half full-time-equivalent (FTE) application developers on the project. It took them about three months, and that included integrating with multiple payment-card interfaces. I think we started at the end of October and went live at the end of January. So it was pretty lightweight from the standpoint of integrating significant products into our ecosystem.

Stateless tokenization

Gardner: Secure stateless tokenization can often take organizations like yours out of the business of storing credit card information at all. You’re basically passing it through and using various technologies to avoid being in a position where you could have a privacy problem. Was that the case with you, and did you extend that to other types of data?

Gullett: That was one of the marvelous parts of bringing the system online: it took us from storing many millions of credit card numbers down to absolutely zero. We store no payment card numbers at this time. Everything is tokenized. The card data comes into our internal payment process, the system sends it off to the card acquirer to determine whether it should be approved or denied, and it’s immediately tokenized. That has been a real win for the company; there’s just much less to worry about from the card standpoint.
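The ingest flow described above, authorize first, tokenize immediately, and persist only the token, might be sketched as follows. The `acquirer`, `tokenizer`, and `store` objects are hypothetical stand-ins for Allegiant’s real payment components, not its actual interfaces.

```python
from dataclasses import dataclass

@dataclass
class AuthResult:
    approved: bool
    token: str   # the only card-derived value that is ever persisted

def authorize_and_tokenize(pan: str, amount_cents: int,
                           acquirer, tokenizer, store) -> AuthResult:
    """Send the PAN to the card acquirer for approval, tokenize it
    immediately, and store only the token with the transaction record.
    The raw PAN exists only transiently in memory."""
    approved = acquirer.authorize(pan, amount_cents)
    token = tokenizer.tokenize(pan)
    store.save_transaction(token=token, amount_cents=amount_cents,
                           approved=approved)
    return AuthResult(approved=approved, token=token)
```

The point of the pattern is that the transaction store, and everything downstream of it, is taken out of PCI scope because no real card number ever lands there.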

Now, from the standpoint of how we can encrypt or protect other data, we’re looking at a number of possible scenarios now that we’ve gotten past the PCI hurdle. For example, while we don’t fly internationally with scheduled service, we do handle charters for other companies. At some point, the company may well fly to international locations, and we will be collecting passport numbers. That would be the kind of thing we would also look at, in effect using some type of format-preserving encryption, so that we’re not storing the actual data.

We’ve gained a lot of experience with the product over the last three years and that’s going to be a fairly easy implementation that will offer a great deal of protection. But we can also extend that out to customer names, birth dates, and all kinds of different things and we are looking at that now.

Gardner: The HPE SecureData Web and the Page-Integrated Encryption are being used by a lot of folks for the webpage, of course, the browser-based apps, but that also can provide a secure way to go to mobile. Many people are interested in the mobile web, not necessarily just native apps. Is that something you have been able to use as well? The SecureData Web as a way to get to the mobile edge securely?

Gullett: We do use SecureData Web in our mobile applications. We’ve been using it since we initially integrated the product several years ago. In fact, that was one of the data points we had to protect from Day One. So we have the app going out to the Internet, grabbing the one-time encryption key, encrypting the data in the application itself on the mobile device, the Android device or the Apple device, and then sending that encrypted data back to our payment-processing system, passing through any systems in the middle in encrypted form.
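The flow Gullett describes, where the app fetches a one-time key and encrypts card data on the device so every system in the middle relays only ciphertext, can be sketched like this. The hash-based stream cipher is a simplified stand-in; the article does not detail SecureData Web’s actual mechanics, and the class and function names here are illustrative.

```python
import hashlib
import os

def _keystream(key: bytes, n: int) -> bytes:
    """Derive n bytes of keystream from the one-time key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def _xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

class OneTimeKeyServer:
    """Stands in for the payment back end: it issues one-time keys and is the
    only party that can decrypt. Systems between the device and the back end
    never hold the key, so they only ever relay ciphertext."""
    def __init__(self) -> None:
        self._keys: dict[str, bytes] = {}

    def issue_key(self) -> tuple[str, bytes]:
        key_id, key = os.urandom(8).hex(), os.urandom(32)
        self._keys[key_id] = key
        return key_id, key

    def decrypt(self, key_id: str, ciphertext: bytes) -> str:
        key = self._keys.pop(key_id)          # one-time: the key is consumed
        return _xor(ciphertext, key).decode()

def encrypt_on_device(card_number: str, key_id: str, key: bytes):
    """Runs in the mobile app: encrypt before anything leaves the device."""
    return key_id, _xor(card_number.encode(), key)
```

Popping the key on decryption captures the one-time property: even if an intermediary replays the ciphertext, the back end has already discarded the only material that could decrypt it.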

We also have a subsidiary that is not directly airline-related and that is developing a payment-processing app for the business space it works within. Because they’re developing a true native application for iOS, they’ll be developing with the SecureData Web SDK that’s been released for mobile devices, which will certainly be much easier.

Gardner: Chris, we hear a lot of times that security is a cost center, that people don’t necessarily see it as a way of bolstering business value or growing revenue streams. It sounds like when you can employ some of these technologies, create a better posture, it frees you up, it makes you able to innovate and transform. Has that been the case with you? Can you point to any ways in which you’ve actually been able to increase revenue? I know that for airlines it’s a fairly tight margin on the travel, but some of those ancillary services can be a make or break; is that the case here?

Unbundled travel

Gullett: Allegiant is a leader in what we call unbundled travel; we would rather sell you exactly what you want. When an airline says it offers free bags, for example, it’s not really offering free bags. It costs money to put those bags in the hold or the overhead bin and to carry them on the plane with you. There is weight, and weight costs fuel. So, there is an expense associated with every aspect of your travel on an airline today; that’s just the way it is.

Allegiant’s unbundled services allow us to say to a traveler, “Well, sure, if you want to get on the plane and you want to bring something and put it under the seat, we’ll sell you a seat on the plane. If you want to bring 40 pounds of baggage to put in the hold, we’ll charge for that,” because not everybody wants to bring a 40-pound bag to put in the hold.

The thing about Allegiant, with its proprietary application that runs the airline, is that if we see an opportunity to offer a new service or a new ancillary service to the customer, we don’t have to go to a third party and say, “Would you please add this so we can offer this feature?” We can just do it.

At the same time that we were worrying about how we were going to accomplish PCI compliance, we also had a project to begin charging for carry-on bags, the bags that go up in the overhead bins. We could either spend a lot of time retrofitting the legacy app for PCI, or we could spend that time generating revenue by offering this new carry-on bag feature to customers.

The seats on the plane, and everything associated with the airline, have a very quick expiration date. When the plane takes off, an empty seat has no value, and it will never have value again. When a seat takes off empty, we can’t sell that would-be passenger a Coke, a bag, a [rental] car, or a hotel room; that’s gone forever. So, speed to market is incredibly important for the airline industry, and it may be even more important for Allegiant.

Amid our travails with PCI and how we were going to solve our compliance issue, we wanted to be able to add this feature to charge for carry-on bags. So you have a choice: do you spend a lot of time integrating and cleaning up legacy apps for PCI, or do you move ahead with something that could bring in millions of dollars in revenue? The answer, of course, is that you have to be compliant with PCI. So, we had to do that first.


The fact that we were able to implement the necessary controls with the HPE products in about three months, with about one-and-a-half FTEs, meant that other application developers could spend time on that carry-on bag feature in our software, allowing us to go to market with that sooner than we would have otherwise.

We went to market three months earlier than we would have if we had stopped everything for three months to do nothing but PCI compliance. Instead, we were able to use that time to develop the carry-on bag charging service. That is millions of dollars that would never have been captured any other way, because that inventory expires; once the plane leaves the ground, you can’t charge anymore.

So there was a real delivery to the bottom line from being able to roll out that carry-on bag feature sooner. Using the HPE Security products, the integration was much easier, quicker, and less resource-intensive.

Where next?

Gardner: So going back to our opening sentiment that you can’t just wall off data: the more data, the better for your business, and the more places that data can get to, the better. You’ve demonstrated that this is also core to business innovation, such as growing revenue in new ways and being agile and adaptive in very competitive markets. That’s a very interesting example.

Before we sign off, Chris, where do you go next? How do you think your security steps so far have enabled you to be more fleet, more agile, and perhaps find other business benefits?

Gullett: There is no substitute for delivering innovative solutions to problems that are well-known throughout the business; that builds your credibility with the executives and the board of directors. The solution to our PCI-compliance issues got a lot of exposure to the company’s executives and the board, and being able to solve it quickly, without an impact on the operations of the airline, brought information-security awareness to a level that we had not previously enjoyed at the airline.

If you talk to our executives and our board, they’re going to tell you information security is very important, and I believe they believe that. But the fact that you can demonstrate you can deliver solutions that don’t break the bank, and that do what they say they do, means a lot.

Going back to that three-legged stool, technology, and the HPE Security products we implemented for PCI, is just one part. For example, if the folks aren’t handling credit cards properly, or if they’re not adequately protecting the data on their mobile devices out in the field, our risk is just as great as a credit-card data breach would have been before we implemented the tokenization. These are all things we worry about.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.


ChainLink analyst on how cloud-enabled supply chain networks drive companies to better manage finances, procurement

The next BriefingsDirect business innovation thought leadership discussion focuses on how companies are exploiting advances in procurement and finance services to produce new types of productivity benefits.

We’ll now hear from a leading industry analyst on how more data, process integration, and analysis efficiencies of cloud computing are helping companies to better manage their finances in tighter collaboration with procurement and supply-chain networks. This business-process innovation exchange comes in conjunction with the Tradeshift Innovation Day held in New York on June 22, 2016.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about how new trends are driving innovation into invoicing and spend management, we’re joined by Bill McBeath, Chief Research Officer at ChainLink Research in Newton, Mass. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What’s going on in terms of disruption across organizations that are looking to do better things with their procurement-to-payment processes? What is it that’s going on, that’s focusing them more on this? Why is the status quo no longer acceptable?

McBeath: There are a couple of things. There is a longer-term trend toward digitization, moving away from paper and manual processes. That's nothing new but, having said that, when we do research, we always see a huge percentage of companies that are still either on paper or, even more commonly, on a mix. They have some portion of their stuff on paper and another portion that's automated. That's foundational and still in process.


A big part of that is getting the long tail of suppliers on board. The large suppliers have the internal resources to get hooked up with these networks and systems and get automated. Smaller suppliers (think of companies with fewer than 100 people), and even mid-sized suppliers, often have no dedicated IT resources. They may have a very limited ability to do these things.

That’s where the challenge is and that’s where we see some of the innovations in helping lower the barriers for them. It’s helping get a company that’s trying to automate all of their invoices or other things — that can be a mix of paper, fax, e-mail, and EDI documents — and then gradually move that customer base over to some sort of automation, whether it’s through a portal or starting to directly integrate their systems.

So, that ability to bring the long tail in, so that ultimately everything comes in digitally, is one of the things we're seeing.

Common denominator

Gardner: In order to get digital, as you put it, it seems we need a common-denominator environment that all the players — the suppliers, the buyers, the partners — can play in. It can't be too confining, but it can't be too loosey-goosey and insecure either. Have we found that balance: the right level of platform that's suitable for these processes, but that doesn't stifle innovation and doesn't push people away with rigid rules?

McBeath: I want to make a couple points on that. One is about the network approach, versus the portal approach. They are distinctive approaches. In the portal approach, each buyer will set up their own portal and that’s how they’ll try to get that long tail in. The problem for the suppliers is that if they have dozens or hundreds of customers, they now have dozens or hundreds of portals to deal with.

The network is supposed to solve that problem with a network of buyers and suppliers. If you have a supplier who has multiple buyers on the network, they just have to integrate once to the network. That’s the theory, and it helps, but the problem there is that there are also lots of networks.

No one has cracked the nut yet, from the supplier’s point of view, on how not to deal with all these multiple technologies. There are a couple companies out there that are trying to build this supplier capability to just integrate once into one network and then it goes out and gets all the other networks. So, people are trying to solve that problem.
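The "integrate once to the network" idea McBeath describes is essentially an adapter layer: the supplier talks to one gateway, and per-network adapters absorb each network's differences. A hypothetical sketch, with the class and network names invented for illustration:

```python
class NetworkAdapter:
    """One adapter per business network; the supplier never sees the differences."""

    def __init__(self, name: str):
        self.name = name
        self.sent = []  # documents delivered through this network

    def send_invoice(self, invoice: dict) -> str:
        # a real adapter would translate to the network's format (EDI, cXML, ...)
        self.sent.append(invoice)
        return f"{self.name}:{invoice['number']}"

class SupplierGateway:
    """The supplier integrates once, here; adapters fan documents out."""

    def __init__(self, adapters):
        self.adapters = {a.name: a for a in adapters}

    def submit(self, buyer_network: str, invoice: dict) -> str:
        return self.adapters[buyer_network].send_invoice(invoice)

gateway = SupplierGateway([NetworkAdapter("NetA"), NetworkAdapter("NetB")])
ref = gateway.submit("NetA", {"number": "INV-100", "amount": 250.0})
assert ref == "NetA:INV-100"
```

The supplier's own systems only ever call `submit`; onboarding a new buyer network means adding one adapter, not another point-to-point integration, which is the economics McBeath says no one has fully cracked across competing networks.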

Gardner: And we have seen this before, for example, with platforms that provide an environment to develop on, offering services that people use in the customer relationship management (CRM) space. We saw in June that Tradeshift has come out with an app store. Is this what you're getting at? Do you think the app-store model, with a development element to it, is an important step in the right direction?

McBeath: I mentioned there were two points. The network point was one point, and the second one is exactly what you’re talking about, which is that you may have a network, but it’s still constrained to just that solution provider’s functionality.

The Tradeshift approach is different. It's not just a set of APIs to integrate to their application; it's really a full development kit, so that you can build applications on top of that.

There’s a bit of a fuzzy line there, but there are definitely things you can point to. There are enough APIs that you can write an application from scratch. That’s question number one. Does that include UI integration? That would be the second question I would ask, so that when you develop using their UI  APIs and UI guidelines, it actually looks as fully integrated as if it was one application.

There’s also a philosophy point of view. More and more large-solution providers are kind of in the “light bulb is going out” [stage] and they can’t necessarily build it all. Everyone has had partners. So, there’s nothing new about partnering and having ISV partners and integrating, but it’s a wholesale shift to building a whole toolkit, promoting it, and making it easy, and then trying to get others to build those pieces. That’s a different kind of approach.

Gardner: So clearly, a critical mass is necessary to attract enough suppliers that then attracts the buyers, that then attracts more development, and so on. What’s an important element to bring to that critical mass capability? I’m thinking about data analytics as one, mobile enablement, and security. What’s the short list of critical factors that you think these network and platform approaches need to have in order to reach critical mass?

Critical mass

McBeath: I would separate it into technology and industry-focused things, and I’ll cover the second one first. Supplier communities, especially for direct materials, tend to cluster around industries. What I see for these networks is that they can potentially meet critical mass within a specific industry by focusing on the industry. So, you get more buyers in the industry, more suppliers in the industry, and now it becomes almost the de facto way to do business within that industry.

Related to that, there are sometimes very industry-specific capabilities that are needed on the platform. It could be regulated industries like pharma or chemicals that have certain things they have to do that are different from other industries. Or it could be aerospace and defense, which has super-high security requirements; they may look for robust identity-management capabilities.

That would be one aspect of building up a critical mass within an industry. Indirect is a little more of a horizontal play; indirect suppliers tend to go more across industries. In that case, it can be just the aggregate size of the marketplace, but it can also be the capabilities that are built in.

One interesting part of this is the supplier's perspective. For some of these networks, what they offer suppliers is basically a platform to get noticed and to transact. But some companies are trying to provide more value to suppliers, not just in terms of how they market themselves, but also outward-facing supply-chain and logistics capabilities. They're building rich capabilities that suppliers might actually be willing to pay for, instead of just paying for the privilege of transacting on a platform.

Gardner: Suffice it to say, things are changing rapidly in the procure-to-pay space. What advice would you give both buyers and suppliers when it comes to surveying the landscape, making evaluations, and making good decisions about being on the leading edge of disruption, taking advantage of it rather than being injured or negatively impacted by it?

McBeath: That can be a challenging question. Eventually, the winners become quite obvious in the network space, because certain networks, as I mentioned, will dominate within an industry. Then, it becomes a somewhat easy decision.

Before that happens, you’re trying to figure out if you’re going to bet on the right horse. Part of that is looking at the kind of capabilities on the platform. One of them that’s important, going back to this API extensibility thing, is that it’s very difficult for one platform to do it all.

So, you’d look at whether they can do 80 percent of what you need. But then, do they also provide the tools for the other 20 percent, especially if that 20 percent, even though it may be a small amount of functionality, it may be very critical functionality for your business that you really can’t live without or get high value from? If it has the ability for you to build that yourself, so that you can really get the value, that’s always a good thing.

Gardner: It sounds like it would be a good idea to try a lot of things on, see what you can do in terms of that innovation at the platform level, look at the portal approach, and see what works best for you. We’ve heard many times that each company is, in fact, quite different, and each business grouping and ecosystem is different.

Getting the long tail

McBeath: There’s a supplier perspective, and there is a buyer perspective. Besides your trading partners on the platform, from a buyer’s perspective, one of the things we talked about is getting that long tail.

Buyers should be looking at, and interested in, what level of effort it takes to onboard a new supplier, how automated can that be, and then how attractive is it to the supplier. You can ask or tell your suppliers to get on board. But if it’s really hard to do, if it’s expensive for them, if it takes a lot of time, then it’s going to be like pulling teeth. Whereas, if there are benefits for the suppliers, it’s easy to do, and it’s actually helping them, this becomes much easier to get that long tail of suppliers onboard.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Tradeshift.


How European GDPR compliance enables enterprises to both gain data privacy and improve their bottom lines

The next BriefingsDirect security market transformation discussion focuses on the implications of the European Parliament’s recent approval of the General Data Protection Regulation or GDPR.

This sweeping April 2016 law establishes a fundamental right to personal data protection for European Union (EU) citizens. It gives enterprises that hold personal data on any of these people just two years to reach privacy compliance — or face stiff financial penalties.

But while organizations must work quickly to comply with GDPR, the strategic benefits of doing so could stretch far beyond data-privacy issues alone. Attaining a far stronger general security posture — one that also provides a business competitive advantage — may well be the more impactful implication.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

We’ve assembled a panel of cybersecurity and legal experts to explore the new EU data privacy regulation and discuss ways that companies can begin to extend these needed compliance measures into essential business benefits.

Here to help us sort through the practical path of working within the requirements of a single digital market for the EU are: Tim Grieveson, Chief Cyber and Security Strategist, Enterprise Security Products EMEA, at Hewlett Packard Enterprise (HPE); David Kemp, EMEA Specialist Business Consultant at HPE, and Stewart Room, Global Head of Cybersecurity and Data Protection at PwC Legal. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Tim, the GDPR could mean significant financial penalties in less than two years if organizations don’t protect all of their targeted data. But how can large organizations look at this under a larger umbrella, perhaps looking at this as a way of improving their own security posture?

Grieveson: It’s a great opportunity for organizations to take a step back and review the handling of personal information and security as a whole. Historically, security has been about locking things down and saying no.


We need to break that mold. But, this is an opportunity, because it’s pan-European, to take a step back, look at the controls that we have in place, look at the people, look at the technology holistically, and look at identifying opportunities where we can help to drive new revenues for the organization, but doing it in a safe and secure manner.

Gardner: David, is there much difference between privacy and security? If one has to comply with a regulation, doesn’t that also give them the ability to better master and control their own internal destiny when it comes to digital assets?

Kemp: Well, that’s precisely what a major European insurance company headquartered in London said to us the other day. They regard GDPR as a catalyst for their own organization to appreciate that the records management at the heart of their organization is chaotic. Furthermore, what they’re looking at, hopefully with guidance from PwC Legal, is for us to provide them with an ability to enforce the policy of GDPR, but expand this out further into a major records-management facility.

Gardner: And Stewart, wouldn’t your own legal requirements for any number of reasons be bolstered by having this better management and privacy capability?

The Changing Face of Risk
Protect Your Digital Enterprise
Watch the Video to Get Started

Room: The GDPR obviously is a legal regime. So it’s going to make the legal focus much, much greater in organizations. The idea that the GDPR can be a catalyst for wider business-enabling change must be right. There are a lot of people we see on the client side who have been waiting for the big story, to get over the silos, to develop more holistic treatment for data and security. This is just going to be great — regardless of the legal components — for businesses that want to approach it with the right kind of mindset.

Kemp: To complement that, I heard a recognition the other day from a corporate client, who said, "I get it. If we could install a facility that would help us with this particular regulation, to a certain extent relying once again on external counsel to assist us, we could almost feed any other regulation into the same engine."


That is very material in terms of getting sponsorship, buy-in, and interest from the front of the business, because this isn't a facility simply for this one particular type of regulation. There's so much more that it could be engaged on.

Room: The important part, though, is that it’s a cultural shift, a mindset. It’s not a box-ticking exercise. It’s absolutely an opportunity, if you think of it in that mindset, of looking holistically. You can really maximize the opportunities that are out there.

Gardner: And because we have a global audience for our discussion, I think that this might be the point on the arrow for a much larger market than the EU. Let’s learn about what this entails, because not everyone is familiar with it yet. So in a nutshell, what does this new law require large companies to do? Tim, would you like to take that?

Protecting information

Grieveson: It’s ultimately about protecting European citizens’ private and personal information. The legislation gives some guidance around how to protect data. It talks about encryption and anonymization of the information, should that inevitable breach happen, but it also talks about how to enable a quicker response for a breach.

To go back to David’s point earlier on, the key part of this is really around records management. It’s understanding what information you have where and classifying that information. What you need to do with it is key to this, ultimately because of the bad guys out there. In my world as an ex-CIO and as an ex-CISO, I was always looking to try and protect myself from the bad guys who were changing their process to monetize.

They’re ultimately out to steal something, whether it be credit card information, personal information, or intellectual property (IP). Organizations often don’t understand what information they have where or who owns it, and quite often, they don’t actually value that data. So, this is a great approach to help them do that.

Gardner: And what happens if they don’t comply? This is a fairly stiff penalty.

Grieveson: It is. Up to four percent of the parent company's annual revenue is exposed as a fine, and there's also mandatory breach notification: companies need to inform the authorities within 72 hours of a breach.

If we think of the Ponemon report, the average time that the bad guys are inside an organization is 243 days, so clearly that's going to be a challenge for lots of organizations that don't know they have been breached. And the remediation afterward, once that inevitable breach happens, takes on average anywhere from 40 to 47 days globally.

We’re seeing that trend going in the wrong direction. We’re seeing it getting more expensive. On average, a breach costs in excess of US$7.7 million, but we are also seeing the time to remediate going up.

This is what I talked about with this cultural change in thinking. We need to get much smarter about understanding the data we have and, when we have that inevitable breach, protecting the data.

Gardner: Stewart, how does this affect companies that might not just be based in the EU countries, companies that deal with any customers, or supply chain partners, alliances, the ecosystem. Give us a sense of the concentric circles of impact that this pertains to inside the EU and beyond?

Room: Yes, the law has global effect. It's not about just regulating European activities or protecting or controlling European data. The way it works is that any entity or data controller that's outside of Europe and that targets Europe for goods and services will be directly regulated. It doesn't need to have an establishment, a physical presence, in Europe; it targets the goods and services. Or, if that entity profiles and tracks the activity of European citizens on the web, it's regulated as well. So, there are regulated entities that are physically not in Europe.

Any entity outside of Europe that receives European data or data from Europe for data processing is regulated as well. Then, any entity that’s outside of Europe that exports data into Europe is going to be regulated as well.

So it has global effect. It’s not about the physical boundaries of Europe or the presence only of data in Europe. It’s whether there is an effect on Europe or an effect on European people’s data.

Fringes of the EU

Kemp: If I could add to that, the other point is about those on the fringes of the EU, because that is where this is originating from: places such as Norway and Switzerland, and even South Africa with its POPI legislation. These countries are not part of the EU but, as Stewart was saying, because a lot of their trade goes through the EU, they're adopting local regulation that mirrors it, in order to provide a level playing field for their corporates.

Gardner: And this notion of a fundamental right to personal data protection, is that something new? Is that a departure and does that vary greatly from country to country or region to region?

Room: This is not a new concept. The European data-protection law was first promulgated in the late 1960s. So, that’s when it was all invented. And the first European legislative instruments about data privacy were in 1973 and 1974.


We’ve had international data-protection legislation in place since 1980, with the OECD, the Council of Europe in 1981, the Data Protection Directive of 1995. So, we’re talking about stuff that is almost two generations old in terms of priority and effect.

The idea that there is a fundamental right to data protection has been articulated expressly within the EU treaties for a while now. So, it’s important that entities don’t fall into the trap of feeling that they’re dealing with something new. They’re actually doing something with a huge amount of history, and because it has a huge amount of history, both the problems and the solutions are well understood.

If the first time that you deal with data protection, you feel that this is new, you’re probably misaligned with the sophistication of those people who would scrutinize you and be critical of you. It’s been around for a long time.

Grieveson: I think it’s fair to say there is other legislation as well in certain industries that make some organizations much better prepared for dealing with what’s in the new legislation.

For example, in the finance industry, you have payment card industry (PCI) security around credit-card data. So, some companies are going to be better prepared than others, but it still gives us an opportunity as an auditor to go back and look at what you have and where it fits.

Gardner: Let’s look at this through the solution lens. One of the ways that the law apparently makes it possible for this information to leave its protected environment is if it’s properly encrypted. Is there a silver bullet here where if everything is encrypted, that solves your problem, or does that oversimplify things?

No silver bullet

Grieveson: I don’t think there is a silver bullet. Encryption is about disruption, because ultimately, as I said earlier, the bad guys are out to steal data, if I come from a cyber-attack point of view, and even the most sophisticated technologies can at some point be bypassed.

But what it does do is reduce that impact, and potentially the bad guys will go elsewhere. But remember, this isn’t just about the bad guys; it’s also about people who may have done something inadvertently in releasing the data.

Encryption has a part to play, but it’s one of the components. On top of that, you have technology around having the right people and the right process, having the data-protection officer in place, and training your business users and your customers and your suppliers.

The encryption part isn’t the only component, but it’s one of the tools in your kit bag to help reduce the likelihood of the data actually being commoditized and monetized.

The Changing Face of Risk
Protect Your Digital Enterprise
Watch the Video to Get Started

Gardner: And this concept of the personally identifiable information (PII), how does that play a role, and should companies that haven’t been using that as an emphasis perhaps rethink the types of data and the types of identification with it?

Room: The idea of PII is known to US law. It lives inside the US legal environment, and it’s mainly constrained to a number of distinct datasets. My point is that the idea of PII is narrow.

The [EU] data-protection regime is concerned with something else, personal data. Personal data is any information relating to an identifiable living individual. When you look at how the legislation is built, it’s much, much more expansive than the idea of PII, which seems to be around name, address, Social Security number, credit-card information, things like that, into any online identifier that could be connected to an individual.

The human genome is an example of personal data. It’s important that listeners in a global sense understand the expansiveness of the idea or rather understand that the EU definition of personal data is intended to be highly, highly expansive.

Gardner: And, David Kemp, when we’re thinking about where we should focus our efforts first, is this primarily about business-to-consumer (B2C) data, is it about business to business (B2B), less so or more so, or even internally for business to employee (B2E)? Is there a way for us to segment and prioritize among these groups as to what is perhaps the most in peril of being in violation of this new regulation?

Commercial view

Kemp: It’s more a commercial view rather than a legal one. The obvious example will be B2C, where you’re dealing with a supermarket like Walmart in the US or Coop or Waitrose in Europe, for example. That is very clearly my personal information as I go to the supermarket.

Two weeks ago I was listening to the head of privacy at Statoil, the major Norwegian energy company, and they said we have no B2C, but in fact, even just the employee information we have is critical to us and we’re taking this extremely seriously as the way in which we manage that.

Of course, that means this applies to every single corporate, that it is both an internal and an external aggregation of information.

Grieveson: The interesting thing is, as digital disruption comes to all organizations and we start to see the proliferation and the tsunami of data being gathered, it becomes more of a challenge or an opportunity, depending on how you look at that. Literally, the new [business] perimeter is on your mobile phone, on your cellphone, where people are accessing cloud services.

If I use the British Airways app, for example, I'm literally accessing 18 cloud services through my mobile phone. That, then, makes it a target for data gathering. Do I really understand what's being stored where? That's where this really helps: formalizing what information is stored where, and how it is being transacted and used.

Gardner: On another level of segmentation, is this very much different for a government, or public organization, versus a private? There might be some verticals industries like finance or health, where they’ve become accustomed to protecting data, but does this have implications for the public sector as well?

Room: Yes, the public sector is regulated by this. There’s a separate directive that’s been adopted to cover policing and law enforcement, but the public sector has been in scope for a very long time now.

Gardner: How does one go about the solution on a bit more granular level? Someone mentioned the idea of the data-protection officer. Do we have any examples or methodologies that make for a good approach to this, both at the tactical level of compliance but also at the larger strategic level of a better total data and security posture? What do we do, what’s the idea of a data-protection officer or office, and is that a first step — or how does one begin?

Compliance issue

Room: We’re stressing to entities that data [management] view. This is a compliance issue, and there are three legs to the stool. They need to understand the economic goals that they have through the use of data or from data itself. So, economically, what are they trying to do?

The second issue is the question of risk, and where does our risk appetite lie in the context of the economic issues? And then, the third is obligation. So, compliance. It’s really important that these three things be dealt with or considered at the very beginning and at the same time.

Think about the idea simply of risk management. If we were to look at risk management in isolation of an economic goal, you could easily build a technology system that doesn’t actually deliver any gain. A good example would be personalization and customer insights. There is a huge amount of risk associated with that, and if you didn’t have the economic voice within the conversation, you could easily fail to build the right kind of insight or personalization engine. So, bringing this together is really important.

Once you’ve brought those things together in the conversation, the question is what is your vision, what’s your desired end-state, what is it that you’re trying to achieve in light of those three things? Then, you build it out from there. What a lot of entities are doing is making tactical decisions absent the strategic decision. We know that, in a tactical sense, it’s incredibly important to do data mapping and data analysis.

We feel at PwC that that's a really critical step to take, but you want to be doing that data mapping in the context of a strategic view, because it affects the order of priority and how you tackle the work. Some non-obvious matters will become clearer, and may prove more pressing than data mapping, if you take the proper strategic view.

A specific example of that would be complaint handling. Not many people have complaint handling on the agenda — how we operate inside the call center, for instance. If customers are cross, handling that well is probably a much more important strategic matter at the very beginning than some of the more obvious steps you might take. Bringing those things forward, and having a vision for a desired end-state, will tell you the steps you want to take.
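The data-mapping exercise Room describes often starts as a plain inventory of which systems hold which personal-data fields, queried for unprotected holdings that should head the remediation queue. A toy sketch only; the systems, fields, and flags here are invented for illustration:

```python
# Illustrative data-map inventory: one row per field held in each system.
data_map = [
    {"system": "CRM",        "field": "email",         "personal": True,  "encrypted": False},
    {"system": "CRM",        "field": "segment",       "personal": False, "encrypted": False},
    {"system": "Payments",   "field": "card_token",    "personal": True,  "encrypted": True},
    {"system": "CallCenter", "field": "complaint_txt", "personal": True,  "encrypted": False},
]

def unprotected_personal_data(inventory):
    """Flag personal-data fields with no safeguard: the first remediation queue."""
    return [(row["system"], row["field"])
            for row in inventory
            if row["personal"] and not row["encrypted"]]

exposed = unprotected_personal_data(data_map)
assert exposed == [("CRM", "email"), ("CallCenter", "complaint_txt")]
```

Note that the call-center complaint text surfaces alongside the obvious CRM fields, which mirrors Room's point that complaint handling can outrank more obvious steps once you take the strategic view.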

Gardner: Tim, this isn’t something you buy out of a box. The security implications of being able to establish that a breach has taken place in as little 72 hours sounds to me like it involves an awful lot more than a product or service. How should one approach this from the security culture perspective, and how should one start?

Grieveson: You’re absolutely right. This is not a single product or a point solution. You really have to bake it into the culture of your organization and focus not just on single solutions, but actually the end-to-end interactions between the user, the data, and the application of the data.

If you do that, what you're starting to look at is how to build things in a safe, secure manner, but also how to build them to enable your business to do something. There's no point in building a data lake, for example, and gathering all this data unless you actually derive from it some insight that is actionable and measured against the business outcomes.

I actually don’t use the word “security” often when I am talking to customers. I’ll talk about “protection,” whether that’s protection of revenue or growing new markets. I put it into business language, rather than using technology language. I think it’s the first thing, because that puts people off.

What are you protecting?

The second thing is to understand what it is that you're going to protect and why, and where it resides, and then to start building the culture from the top down and also from the bottom up. It's not just the data-protection office's problem to deal with. It's not just the CIO or the CISO; it's building a culture in your organization where it becomes normal, everyday business. Good security is good business.

Once you’ve done that, this is not a project; it’s not do it once and forget it. It’s really around building a journey, but this is an evolving journey. It’s not just a matter of doing it, getting to the point where you have that check box to say, yes, you are complying. It’s absolutely around continuing to look at how you’re doing your business, continuing to look at your data as new markets come on or new data comes on.

You have to reassess where you are in this structure. That’s really important, but the key thing for me is that if you focus on that data and those interactions, you have less of a conversation about the technology. The technology is an enabler, but you do need a good mix of people, process, and technology to deliver good security in a data-driven organization.

Gardner: Given that this cuts across different groups within a large organization that may not have had very much interaction in the past — given that this is not just technology but process and people, as Tim mentioned — how does the relationship between HPE and PwC come together to help organizations solve this? Perhaps you can describe the alliance a bit for us.

Kemp: I’m a lawyer by profession. I very much respect our ability to collaborate with PwC, which is a global alliance [partner] of ours. On the basis of that, I regard Stewart and his very considerable department as providing a translation of the regulation into deliverables. What is it that you want me to do, what does the regulation say? It may say that you have to safeguard information. What does that entail? There are three major steps here.

One is the external counsel’s guidance on translating what the regulation means into a set of deliverables.

Secondly, a privacy audit. This has been around as a concept since the 1960s. Where are you already in terms of your management of PII? When that is complete, then we can introduce the technology that you might need in order to make this work. That is really where HPE comes in. That’s the sequence.

Then, if we just look very simply at the IT architecture, what’s needed? Well, as we said right at the beginning, my view is that this is under the records management coherence strategy in an organization. One of the first things is, can you connect to the sources of data around your organization, given that most entities have grown up by acquisition and not organically? Can you actually connect to and read the information where it is, wherever it is around the world, in whatever silo?

For example, Volkswagen had a little problem in relation to diesel emissions, but one of the features there is not so much how they defend themselves, but how they get to the basic information in many countries as to whether a particular sales director knew about this issue or not.

Capturing data

So, connectivity is one point. The second thing is being able to capture information without moving it across borders. That’s where technology that handles the metadata of the basic components of a particular piece of digital information comes in: the data can be captured, whether it is structured or unstructured. Let’s bear in mind that when we’re talking about data, it could be audio, visual, or alphanumeric. Can we bring that together and can we capture it?
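The idea of capturing information in place can be illustrated with a small sketch: record only descriptive metadata (name, size, content hash) for a file, so that a central index can reference and verify it without the content itself ever crossing a border. This is an illustrative sketch, not a description of any HPE product; the file name and field choices are hypothetical.

```python
import hashlib
import tempfile
from pathlib import Path

def capture_metadata(path: Path) -> dict:
    """Read a file where it resides and return only descriptive metadata,
    so a central index never has to hold the content itself."""
    data = path.read_bytes()
    return {
        "name": path.name,
        "size_bytes": len(data),
        "sha256": hashlib.sha256(data).hexdigest(),
    }

# Hypothetical local record that must stay in its home jurisdiction.
with tempfile.TemporaryDirectory() as tmp:
    record = Path(tmp) / "customer_notes.txt"
    record.write_text("PII that must not cross the border")
    entry = capture_metadata(record)
    print(entry["name"], entry["size_bytes"])
```

The hash lets the central index later confirm the source file is unchanged, without ever transferring its contents.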

Then, can we apply rules to it? If you had to say in a nutshell what HPE is doing in collaboration with PwC, we’re doing policy enforcement. Whatever Stewart and his professional colleagues advise in relation to the deliverables, we are seeking to effect that and make it work across the organization.

That’s an easy way to describe it, even to non-technical people. So, General Counsel and the Heads of Compliance or Risk can appreciate the three steps of the legal interpretation, the privacy audit, and then the architecture. After that comes the build-up of information acquisition, to make sure that the standards set by PwC are actually being complied with.

Gardner: We’re coming up toward the end of our time, but I really wanted to get into some examples to describe what it looks like when an organization does this correctly, what the metrics of success are. How do you measure this state of compliance and attainment? Do any of you have an example of an organization that has gone through many of these paces, has acquired the right process, technology and culture, and what that looks like when you get there?

Room: There are various metrics that people have put in place, and it depends which principles you’re talking about. We obviously have security, which we’ve spoken about quite a lot here, but there are other principles: accuracy, retention, deletion, transfers, and so on.

But one of the metrics that entities are putting in, which is not a security control, is the number of people who successfully participate in training sessions and pass the short examination at the end. The reason that key performance indicator (KPI) is important is that during enforcement cases, when things go wrong — and there are lots of these cases out there — the same kinds of challenges are presented by the regulators and by litigants, and that’s an example of one of them.

So, when you’re building your metrics and your KPIs, it’s important to think not just about the measures that achieve operational privacy and operational security, but also about the metrics that people adverse to you would understand: judges, regulators, litigants, and so on. There are essentially two kinds of metrics: operational results metrics, and the judgment metrics that others may apply to you.

Gardner: At HPE, do you have any examples or perhaps you can describe why we think that doing this correctly could get you into a better competitive business position? What is it about doing this that not only allows you to be legally compliant, but also puts you in an advantageous position in a market and in terms of innovation and execution?

Biggest sanction

Kemp: If I could quote some of our clients, especially in the Nordic region, there are about six major reasons for paying strict and urgent attention to this particular subject. The most obvious one, listening to my clients, has to do with compliance; it is also the one that carries the biggest sanction.

But there are another five arguments — I won’t go into all of them — which have to do with advancement of the business. For example, a major media company in Finland said, if we could only say on our website that we were GDPR-compliant, that would materially increase our customers’ belief in our respect for their information, and it would give us a market advantage. So it’s actually advancing the business.

The second aspect, which I anticipated but have also heard from corporations, is that in due course, if it’s not here already, governments might say that if you’re not GDPR-compliant, then you can’t bid on their contracts.

The third might be, as Tim was referring to earlier, what if you wanted to make best use of this information? There’s even a possibility of corporations taking the PII, making sure it’s fully anonymized or pseudonymized, and then mixing it with other freely available information, such as from Facebook, and actually saying to a customer: David, we would like to use your PII, fully anonymized. We can prove to you that we have followed the PwC legal guidance. And furthermore, if we do use this information for analytics, we might even want to pay you for it. What are you doing? You are increasing the bonding and loyalty with your customers.
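The pseudonymization described here can be sketched with keyed hashing: replace each direct identifier with a stable token derived via HMAC, so analytics can still group records by customer without exposing who the customer is. This is an illustrative sketch, not any vendor’s implementation; the field names are hypothetical, and a real deployment would keep the secret key in a managed vault, since anyone holding it can re-link pseudonyms to identities.

```python
import hmac
import hashlib

# Hypothetical secret key for the sketch; in practice this lives in a key vault.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a stable, non-reversible token."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"name": "David Kemp", "email": "david@example.com", "balance": 1200}

# Pseudonymize the direct identifiers; keep the analytic fields intact.
safe_record = {
    "name": pseudonymize(record["name"]),
    "email": pseudonymize(record["email"]),
    "balance": record["balance"],
}
```

Because the same input always yields the same token, pseudonymized records can still be joined and aggregated; under the GDPR such data remains personal data, unlike fully anonymized data.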

So, we should think about the upsides of the business advancement, which ironically is coming out of a regulation, which may not be so obvious.

Gardner: Let’s close out with some practical hints as to how to get started, where to find more resources, both on the GDPR, but also how to attain a better data privacy capability. Any thoughts about where we go to begin the process?

Kemp: I would say that in the public domain, the EU is extremely good at promulgating information about the regulation itself coming in and providing some basic interpretation. But then, I would hand it on to Stewart in terms of what PwC Legal is already providing in the public domain.

Room: We have two accelerators that we’ve built to help entities go forward. The first is our GDPR Readiness Assessment Tool (RAT), and lots of multinationals run the RAT at the very beginning of their GDPR programs.


What does it do? It asks 70 key questions across the two domains of operational and legal privacy. Privacy architecture and privacy principles are mapped into a maturity metric that assesses people’s confidence about where they stand. All of that is then mapped into the articles and recitals of the GDPR. Lots of our clients use the RAT.

The second accelerator is the PwC Privacy and Security Enforcement Tracker. We’ve been tracking the results of regulatory cases and litigation in this area over many years. That gives us a very granular insight into the real priorities of regulators and litigants in general.

Using those two tools at the very beginning gives you a good insight into where you are and what your risk priorities are.

Gardner: Last word to you, Tim. Any thoughts on getting started — resources, places to go to get on your journey or further along?

The whole organization

Grieveson: You need to involve the whole organization. As I said earlier, it’s not just about passing it over to the data-protection officer. You need buy-in from every part of the organization. Clearly, working with organizations that understand the GDPR and its legal implications, such as the collaboration between PwC and HPE, is where I would go.

When I was in the seat as a CISO, I was not a legal expert, so one of the first things I did was go and get that expertise and bring it in. Probably the first place I would start is getting buy-in from the business and making sure that you have the right people around the table to help you on the journey.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.


CA Technologies enhances offerings to streamline cloud and hybrid IT infrastructures

New capabilities in CA Unified Infrastructure Management (CA UIM) are designed to help enterprises adopt cloud more rapidly and better manage hybrid IT infrastructure heterogeneity across several major cloud environments.

Enterprises and SMBs are now clamoring for hybrid cloud benefits, due to the ability to focus on apps and to gain speed for new business initiatives, says Stephen Orban, Global Head of Enterprise Strategy at AWS.

“Going cloud-first allows organizations to focus on the apps that make the business run,” says Orban. “Using hybrid computing, the burden of proof soon shifts to why we should use cloud for more of IT.”

As has been the case with legacy IT for decades, the better the overall management, the better the adoption success, productivity, and return on investment (ROI) for IT systems and the apps they support — no matter the location or IT architecture. This same truth is now being applied to solve the cloud heterogeneity problem, just as it solved the legacy-platform heterogeneity problem. The total-visibility solution may be even more powerful in this new architectural era.

Cloud-first is business-first

The stakes are now even higher. As you migrate to the cloud, one weak link in a complex hybrid cloud deployment can ruin the end-user experience, says Ali Siddiqui, General Manager of Agile Operations at CA. “By providing insight across the performance of all of an organization’s IT resources in a single and unified view, CA UIM gives users the power to choose the right mix of modern cloud enablement technologies.”

CA UIM reduces the complexity of hybrid infrastructures by providing visibility across on-premises, private-, and public-cloud infrastructures through a single console UI. Such insight enables users to adopt new technologies and expand monitoring configurations across existing and new IT resource elements. CA expects the solution to reduce the need for multiple monitoring tools. [Disclosure: CA is a sponsor of BriefingsDirect.]

“Keep your life simple from a monitoring and management perspective, regardless of your hybrid cloud [topology],” said Michael Morris, Senior Director Product Management, at CA Technologies in a recent webcast.

To grease the skids to hybrid cloud adoption, CA UIM now supports advanced performance monitoring of Docker containers, PureStorage arrays, Nutanix hyperconverged systems, and OpenStack cloud environments, plus additional capabilities for Amazon Web Services (AWS) cloud infrastructures, CA Technologies announced last week.

CA is putting its IT systems management muscle behind the problem of migrating from data centers to the cloud, and then better supporting hybrid models, says Siddiqui. The “single pane of glass” monitoring approach that CA is delivering allows measurement and enforcement of service-level agreements (SLAs) before and after cloud migration. This way, continuity of service and IT value-add can be preserved and measured, he added.
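The idea of measuring the same SLA before and after a cloud migration can be sketched in a few lines: collect latency samples from each environment and check one agreed percentile threshold against both. This is an illustrative sketch, not CA UIM’s API; the sample values and the SLA threshold are made up.

```python
import math

# Illustrative latency samples (milliseconds) from monitoring probes in two
# environments; real values would come from the monitoring pipeline.
on_prem = [110, 95, 130, 102, 99, 125, 140, 98, 105, 118]
cloud = [90, 85, 150, 160, 92, 101, 155, 95, 89, 93]

SLA_P95_MS = 145  # agreed service level: 95th-percentile latency ceiling

def p95(samples):
    """Nearest-rank 95th-percentile latency."""
    ordered = sorted(samples)
    k = math.ceil(0.95 * len(ordered)) - 1
    return ordered[k]

# Apply the same SLA check to both environments, as a unified monitor would.
for label, samples in (("on-prem", on_prem), ("cloud", cloud)):
    verdict = "meets" if p95(samples) <= SLA_P95_MS else "violates"
    print(f"{label}: p95 = {p95(samples)} ms, {verdict} SLA")
```

Using one metric definition for every environment is what makes before-and-after comparisons meaningful; if each platform reported its own flavor of latency, the SLA would not be enforceable across the migration.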

Managing a cloud ecosystem

“Using advanced monitoring and management can significantly cut costs of moving to cloud,” says Siddiqui.

Indeed, CA is working with several prominent cloud and IT infrastructure partners to make the growing diversity of cloud implementations a positive, not a drawback. For example, “Virtualization tools are too constrained to specific hypervisors, so you need total cloud visibility,” says Steve Kaplan, Vice President of Client Strategy at Nutanix, of CA’s new offerings.

And it’s not all performance monitoring. Enhancements to CA UIM’s coverage of AWS cloud infrastructures include billing metrics and support for additional services that provide deeper actionable insights on cloud brokering.

CA UIM now also provides:

  • Service-centric and unified analytics capabilities that rapidly identify the root cause of performance issues, resulting in a faster time to repair and better end-user experience
  • Out-of-the-box support for more than 140 on-premises and cloud technologies
  • Templates for easier configuration of monitors that can be applied to groups of disparate systems

What’s more, to ensure the reliability of networks such as SDN/NFV that connect and scale hybrid environments, CA has also delivered CA Virtual Network Assurance, which provides a common view of dynamic changes across virtual and physical network stacks.
