Big data and cloud combo spark momentous genomic medicine advances at HudsonAlpha

The next BriefingsDirect Voice of the Customer IT innovation case study explores how the HudsonAlpha Institute for Biotechnology engages in digital transformation for genomic research and healthcare paybacks.

We’ll learn how HudsonAlpha leverages modern IT infrastructure and big-data analytics to power a pioneering research project incubator and genomic medicine innovator.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe new possibilities for exploiting cutting-edge IT infrastructure and big data analytics for potentially unprecedented healthcare benefits, we’re joined by Dr. Liz Worthey, Director of Software Development and Informatics at the HudsonAlpha Institute for Biotechnology in Huntsville, Alabama. The discussion is moderated by BriefingsDirect’s Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: It seems to me that genomics research and IT have a lot in common. There’s not much daylight between them — two different types of technology, but highly interdependent. Have I got that right?

Worthey: Absolutely. It used to be that the IT infrastructure was fairly far away from the clinic or the research, but now they’re so deeply intertwined that it necessitates many meetings a week between the leadership of both in order to make sure we get it right.

Gardner: And you have background in both.

Worthey: My background is primarily on the biology side, although I’m Director of Informatics and I’ve spent about 20 years working in the software-development and informatics side. I’m not IT Director, but I’m pretty IT savvy, because I’ve had to develop that skill set over the years. My undergraduate degree was in immunology, and since then, my focus has really been on genetics informatics and bioinformatics.

Gardner: Please describe what genetic informatics or genomic informatics is for our audience.

Worthey: Since 2003, when we received the first version of a human reference genome, a large field has grown up around the task of extracting knowledge from genomic data that can be used for society and health.

A [human] genome is 3.2 billion nucleotides in length, and in there, there’s a lot of really useful information. There’s information about which diseases that individual may be more likely to get and which diseases they will get.

It’s also information about which drugs they should and shouldn’t take, and about which types of procedures and surveillance they should have, such as when to have colonoscopies. And so, the clinical aspects of genomics are really about developing the analytical capabilities to extract that data in real time so that we can use it to help an individual patient.

On top of that, there’s also a lot of research. A lot of that is in large-scale studies across hundreds of thousands of individuals to look for signals that are more difficult to extract from a single genome. Genomics, clinical genomics, is all of that together.

Parallel trajectory

Gardner: Where is the societal change potential in terms of what we can do with this information and these technologies?

Worthey: Genomics has existed for maybe 20 years, but the vast majority of that was the first step. Over the last six years, we’ve taken maybe the second or third step in a journey that’s thousands of steps long.

We’re right on the edge. We didn’t used to be able to do this, because we didn’t have any data. We didn’t have the capability to sequence a genome cheaply enough to sequence lots. We also didn’t have the storage capabilities to store that data, even if we could produce it, and we certainly didn’t have enough compute to do the analysis, infrastructure-wise. On top of that, we didn’t actually have the analytical know-how or capabilities either. All of that is really coalescing at the same time.

As we’ve been doing genomics, and the sequencing technology has come up, the compute and computing technologies have come up at the same time. They’re feeding each other, and genomics is now driving IT to think about things in a very different way.

Gardner: Let’s dive into that a little bit. What are the hurdles technologically for getting to where you want to be, and how do you customize that or need to customize that, for your particular requirements?

Worthey: There are a number of hurdles. Certainly, there are simpler hurdles that we have to get past, like storage, and storage tied with compression: how do you compress that data so that you can store millions of genomes at an affordable price?

A bigger hurdle is the ability to query information at a lot of disparate sites. When we think about genomic medicine, one of the things that we really want to do is share data between institutions that are geographically diverse. And the data that we want to share is millions of data points, each of which has hundreds or thousands of annotations or curations.

Those are fairly complex queries, even when you’re doing it in one site, but in order to really change the practice of medicine, we have to be able to do that regionally, nationally, and globally. So, the analytics questions there are large.

We have 3.2 billion data points for each individual. The data is quite broad, but it’s also pretty deep. One of the big problems is that we don’t have all the data that we need to do genomic medicine. There’s going to be data mining — generate the data, form a hypothesis, look at the data, see what you get, come back with a new hypothesis, and so on.
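
As a concrete illustration of that generate-hypothesize-refine loop, here is a minimal Python sketch of the kind of variant filtering it implies: annotated variants are narrowed by population frequency and predicted effect, and the filter is tightened when the first pass returns too many candidates. The field names, thresholds, and structure are illustrative assumptions, not HudsonAlpha’s actual pipeline.

```python
# Minimal sketch of a hypothesis-refinement loop over annotated variants.
# Field names and thresholds are illustrative, not HudsonAlpha's pipeline.
from typing import Dict, List

Variant = Dict[str, object]  # e.g. {"gene": "BRCA1", "pop_freq": 0.0001,
                             #       "effect": "missense", "phenotype_match": True}

def apply_hypothesis(variants: List[Variant], max_pop_freq: float, effects: set) -> List[Variant]:
    """Keep variants that are rare enough and have a predicted effect of interest."""
    return [v for v in variants
            if v["pop_freq"] <= max_pop_freq and v["effect"] in effects]

def refine(variants: List[Variant]) -> List[Variant]:
    # Hypothesis 1: any rare, protein-altering variant.
    candidates = apply_hypothesis(variants, max_pop_freq=0.01,
                                  effects={"missense", "nonsense", "frameshift"})
    # Look at what you get; if too many candidates remain, come back with a
    # tighter hypothesis (rarer variants, phenotype-matched genes only).
    if len(candidates) > 50:
        candidates = [v for v in apply_hypothesis(candidates, 0.001,
                      {"nonsense", "frameshift"}) if v.get("phenotype_match")]
    return candidates
```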

Finally, one of the problems that we have is that a lot of the algorithms you might use only exist in the brains of MDs, other clinical folks, or researchers. There is really a lot of human-computer interaction work to be done, so that we can extract that knowledge.

There are lots of problems. Another big problem is that we really want to put this knowledge in the hands of the doctor while they have seven minutes to see the patient. So, it’s also delivery of answers at that point in time, and the ability to query the data by the person who is doing the analysis, which ideally will be an MD.

Cloud technology

Gardner: Interestingly, the emergence of cloud methods and technology over the past five or 10 years would address some of those issues about distributing the data effectively — and also perhaps getting actionable intelligence to a physician in an actual critical-care environment. How important is cloud to this process and what sort of infrastructure would be optimal for the types of tasks that you have in mind?

Worthey: If you had asked me that question two years ago, on the genomic medicine side, I would have said that cloud isn’t really part of the picture. It wasn’t part of the picture for anything other than business reasons. There were a lot of questions around privacy and sharing of healthcare information, and hospitals didn’t like the idea.

They were very reluctant to move to the cloud. Over the last two years, that has started to change. Enough of them had to decide to do it before everybody would view it as something that was permissible.

Cloud is absolutely necessary in many ways, because we have periods where lots of data has to be computed and analytics have to be run. Then, we have periods where new information is coming off the sequencer. So, it’s that perfect crest and trough.

If you don’t have the ability to deal with that sort of fluctuation, if you buy a certain amount of hardware and you only have it available in-house, your pipeline becomes impacted by the crests and then often sits idle for a long time.
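
To make the crest-and-trough point concrete, the sketch below shows the sort of decision a job scheduler might make: fill fixed in-house capacity first and burst only the overflow to the cloud. The capacity figure and function are invented for illustration and are not HudsonAlpha’s scheduler.

```python
# Illustrative only: decide how many queued genome-analysis jobs to run in-house
# versus burst to the cloud. Capacities are made-up numbers.
IN_HOUSE_SLOTS = 200          # fixed hardware sized for the average load

def plan_run(queued_jobs: int) -> dict:
    in_house = min(queued_jobs, IN_HOUSE_SLOTS)
    burst = max(0, queued_jobs - IN_HOUSE_SLOTS)       # crest: overflow goes to cloud
    return {"in_house": in_house, "cloud_burst": burst,
            "idle_slots": IN_HOUSE_SLOTS - in_house}   # trough: idle capacity

# A crest right after a sequencing run, then a trough:
print(plan_run(650))   # {'in_house': 200, 'cloud_burst': 450, 'idle_slots': 0}
print(plan_run(40))    # {'in_house': 40, 'cloud_burst': 0, 'idle_slots': 160}
```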

But it’s also important to have stuff in-house, because sometimes, you want to do things in a different way. Sometimes, you want to do things in a more secure manner.

It’s kind of our poster child for many of the new technologies that are coming out that look at both of those, that allow you to run things in-house and then also allow you to run the same jobs on the same data in the cloud as well. So, it’s key.

Gardner: That brings me to the next question about this concept of genomics as a service or a platform to support genomics as a service. How do you envision that and how might that come about?

Worthey: When we think about the infrastructure to support that, it has to be something flexible, and it has to be provided by organizations that are able to move rapidly, because the field is moving really quickly.

It has to be infrastructure that supports this hypothesis-driven research, and it has to be infrastructure that can deal with these huge datasets. Much of the data is ordered, organized, and well-structured, but because it’s healthcare, a lot of the information that we use as part of the interpretation phase of genomic medicine is completely unstructured. There needs to be support for extraction of data from silos.

My dream is that the people who provide these technologies will also help us deal with some of these boundaries, the policy boundaries, to sharing data, because that’s what we need to do for this to become routine.

Data and policy

Gardner: We’ve seen some of that when it comes to other forms of data, perhaps in the financial sector. More and more, we’re seeing tokenization, authentication, and encryption, where data can exist for a period of time with a certain policy attached to it, and then something happens to the data as a result of that policy. Is that what you’re referring to?

Worthey: Absolutely. It’s really interesting to come to a meeting like HPE Discover because you get to see what everybody else is doing in different fields. Much of the things that people in my field have regarded as very difficult are actually not that hard at all; they happen all the time in other industries.

A lot of this — the encryption, the encrypted data sharing, the ability to set those access controls in a particular way that only lasts for a certain amount of time for a particular set of users — seems complex, but it happens all the time in other fields. A big part of this is talking to people who have a lot of experience in a regulated environment, just not this particular regulated environment, learning the language they use to talk to the people who set policy there, transferring that to our policy makers, and ideally getting them together to talk to one another.
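
The pattern Worthey describes, access that applies only to a particular set of users and lapses after a set time, can be sketched in a few lines of Python. This is a conceptual illustration, not any vendor’s policy engine.

```python
# Conceptual sketch of a time-limited, user-scoped data-sharing grant.
# Not any vendor's policy engine; structure and field names are assumptions.
from datetime import datetime, timedelta, timezone
from typing import Optional

class SharingGrant:
    def __init__(self, dataset_id: str, users: set, ttl_days: int):
        self.dataset_id = dataset_id
        self.users = users
        self.expires_at = datetime.now(timezone.utc) + timedelta(days=ttl_days)

    def allows(self, user: str, now: Optional[datetime] = None) -> bool:
        now = now or datetime.now(timezone.utc)
        return user in self.users and now < self.expires_at

grant = SharingGrant("genome-cohort-42", {"clinician@site-a.org"}, ttl_days=30)
print(grant.allows("clinician@site-a.org"))   # True until the grant expires
print(grant.allows("analyst@site-b.org"))     # False: not in the allowed set
```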

Gardner: Liz, you mentioned the interest in getting your requirements to the technology vendors, cloud providers, and network providers. Is that under way? Is that something that’s yet to happen? Where is the synergy between the genomic research community and the technology-vendor and platform-provider community?

Worthey: This is happening fast. For genomics, there’s been a shift in the volume of genomic data that we can produce with some new sequencing technology that’s coming. If you’re a provider of hardware, services, or solutions to deal with big data, you’re looking at genomics, because the people here are probably going to overtake many of those other industries in terms of the volume and complexity of the data that we have.

The reason that’s really interesting is that you then get invited to come and talk at forums where there are lots of technology companies, you make them aware of the work that has to be done in the field of medicine and in genomic research, and then you can start having those discussions.

A lot of the things that those companies are already doing, the use cases, are similar and maybe need some refinement, but a lot of that capability is already there.

Gardner: It’s interesting that you’ve become sort of the “New York” of use cases. If you can make it there, you can make it anywhere. In other words, if we can solve this genomic data issue and use the cloud fruitfully to distribute and gather — and then control and monitor the data as to where it should be under what circumstances — we can do just about anything.

Correct me if I am wrong, though. We’re using data in the genomic sense for population groups. We’re winnowing those groups down into particular diseases. How farfetched is it to think about individuals having their own genomic database that would follow them like an authenticated human design? Is that completely out of bounds? How far off might that be?

Technology is there

Worthey: I’ve had my genome sequenced, and it’s accessible. I could pick it up and look at it on the tools that I developed through my phone sitting here on the table. In terms of the ability to do that, a lot of that technology is already here.

The number of people being sequenced is increasing rapidly. We’re already using genomics to make diagnoses in patients and to understand their drug interactions. So, we are here.

One of the things that we are talking about just now is, at what point in a person’s life should you sequence their genome. I and a number of other people in the field believe that that is earlier, rather than later, before they get sick. Then, we have that information to use when they get those first symptoms. You are not waiting until they’re really ill before you do that.

I can’t imagine a future where that’s not what’s going to happen, and I don’t think that future is too far away. We’re going to see it in our lifetimes, and our children are definitely going to see it in theirs.

Gardner: The inhibitors, though, would be more of an ethical nature, not a technological nature.

Worthey: And policy, and society; the society impact of this is huge.

The data that we already have, clinical information, is really for that one person, but your genome is shared among your family, even distant relatives that you’ve never met. So, when we think about this, there are many very hard ethical questions that we have to think about. There are lots of experts that are working on that, but we can’t let that get in the way of progress. We have to do it. We just have to make sure we do it right.

Gardner: To come back down a little bit toward the technology side of things, seeing as so much progress has been made and that there is the tight relationship between information technology and some of the fantastic things that can happen with the proper knowledge around genomic information, can you describe the infrastructure you have in place? What’s working? What do you use for big-data infrastructure, and cloud or hybrid cloud as well?

Worthey: I’m not on the IT side, but I can tell you about the other side, and I can talk a little bit about the IT side as well. In terms of the technologies that we use to store all of that variant information, we’re currently using Hadoop and MongoDB. We finished our proof of concept with HPE, looking at their Vertica solution.

We have to work out what the next steps might be for our proof of concept. Certainly, we’re very interested in looking at the solutions they have here; they fit our needs. The issue being addressed on that side is lots of variants and complex queries that you need to answer really fast.
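
To give a flavor of the kind of workload a column store such as Vertica targets (lots of variants, complex queries, answered fast), here is a hedged sketch of a rare-variant query run from Python. The schema, table and column names, and connection details are invented placeholders rather than HudsonAlpha’s database, and it assumes the open-source vertica-python client is available.

```python
# Illustrative query against a columnar store (e.g., Vertica) for rare,
# protein-altering variants in a gene of interest. Schema, table names, and
# connection settings are invented placeholders, not HudsonAlpha's.
import vertica_python  # assumes the vertica-python client is installed

QUERY = """
    SELECT v.sample_id, v.chrom, v.pos, v.ref, v.alt, a.gene, a.consequence
    FROM variant_calls v
    JOIN variant_annotations a
      ON v.chrom = a.chrom AND v.pos = a.pos AND v.ref = a.ref AND v.alt = a.alt
    WHERE a.gene = 'BRCA2'
      AND a.population_frequency < 0.001
      AND a.consequence IN ('missense', 'stop_gained', 'frameshift')
"""

conn_info = {"host": "analytics.example.org", "port": 5433,
             "user": "readonly", "password": "secret", "database": "genomics"}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    cur.execute(QUERY)
    for row in cur.fetchall():
        print(row)
```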

On the other side, one of the technological hurdles that we have to meet is the unstructured data. We have electronic health record (EHR) information that’s coming in. We want to hook up to those EHRs and we want to use systems to process that data to make it organized, so that we can use it for the interpretation part.

In-house solution

We developed in-house solutions that we’re using right now that allow humans to come in, look at that data, and select the terms from it. So, you’d select disease terms. And then, we have in-house solutions to map them to the genomic side. We’re looking at things like HPE’s IDOL as a proof-of-concept (POC) on that side. We’re talking to some EHR companies about how to hook up the EHR, through those solutions, to our software to make it a seamless product that would give us all of that.
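
A minimal sketch of that term-selection step, pulling disease terms out of unstructured clinical text and mapping them to phenotype-ontology identifiers the genomic side can use, might look like the following. The toy lexicon and HPO-style identifiers are illustrative assumptions, not the institute’s in-house software or HPE IDOL.

```python
# Illustrative only: map free-text disease terms from an EHR note to
# phenotype-ontology identifiers. The lexicon below is a toy stand-in for a
# real vocabulary such as HPO; it is not HudsonAlpha's in-house tooling.
import re

TERM_LEXICON = {
    "seizure": "HP:0001250",
    "developmental delay": "HP:0001263",
    "cardiomyopathy": "HP:0001638",
}

def extract_phenotype_terms(note_text: str) -> dict:
    """Return {matched term: ontology id} for terms found in the note."""
    found = {}
    lowered = note_text.lower()
    for term, term_id in TERM_LEXICON.items():
        if re.search(r"\b" + re.escape(term) + r"\b", lowered):
            found[term] = term_id
    return found

note = "Patient presents with developmental delay and a new-onset seizure."
print(extract_phenotype_terms(note))
# {'seizure': 'HP:0001250', 'developmental delay': 'HP:0001263'}
```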

In terms of hardware, we do have HPE hardware in-house. I think we have 12 petabytes of their storage. We also have DataDirect Networks hardware and a General Parallel File System (GPFS) solution. We even have things down to graphics processors for some of the analysis that we do. We have a large bank of such GPUs, because in some cases they’re much faster for certain types of problems that we have to solve. So we’re pretty IT-rich, with a lot of heavy investment on the IT side.

Gardner: And cloud — any preference to the topology that works for you architecturally for cloud, or is that still something you are toying with?

Worthey: We’re currently looking at three different solutions that are all cloud solutions. We not only do the research and the clinical, but we also have a lab that produces lots of data for other customers, a lab that produces genomic data as a service.

They have a challenge of getting that amount of data returned to customers in a timely fashion. So, there are solutions that we’re looking at there. There are also, as we talked at the start, solutions to help us with that in-flow of the data coming off the sequencers and the compute — and so we’re looking at a number of different solutions that are cloud-based to solve some of those challenges.

Gardner: Before we close, we’ve talked about healthcare and population impacts, but I should think there’s also a commercial aspect to this. That kind of information will lend itself to entrepreneurial activities, products and services in great demand in the marketplace. Is that something you’re involved with as well, and wouldn’t that help foot the bill for some of these many costly IT infrastructure investments?

Worthey: One of the ways that the HudsonAlpha Institute was set up was just that model. We have a research, not-for-profit side, but we also have a number of affiliate companies that are for-profit, where intellectual property and ideas can go across to that side and be used to generate revenue that funds the research and keeps us moving and on the cutting edge.

We do have a services lab that does genomic sequencing and analytics. You can order that from them. We also serve a lot of people who have government contracts for this type of work. And then, we have an entity called Envision Genomics. For disclosure, I’m one of the founders of that entity. It’s focused on empowering people to do genomic medicine and on working with lots of different solution providers to get genomic medicine done everywhere it’s applicable.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

Cybersecurity crosses the chasm: How IT now looks to the cloud for best security

The next BriefingsDirect cybersecurity innovation and transformation panel discussion explores how cloud security is rapidly advancing, and how enterprises can begin to innovate and prevail over digital disruption by increasingly using cloud-defined security.

We’ll examine how a secure content collaboration services provider removes the notion of organizational boundaries so that businesses can better extend processes. And we’ll hear how fewer boundaries and cloud-based security together support transformative business benefits.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To share how security technology leads to business innovations, we’re joined by Daren Glenister, Chief Technology Officer at Intralinks in Houston, and Chris Steffen, Chief Evangelist for Cloud Security at HPE. The discussion is moderated by BriefingsDirect’s Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Daren, what are the top three trends driving your need to extend security and thereby preserve trust with your customers?

Glenister: The top thing for us is speed of business: people being able to do business beyond boundaries, and how they can enable the business rather than just protect it. In the past, security has always been about how we shut things down and stop data. But now it’s how we do it all securely, and how we perform business outside of the organization. So, it’s enabling business.

The second thing we’ve seen is compliance. Compliance is a huge issue for most of the major corporations. You have to be able to understand where the data is and who has access to it, and to know who’s using it and make sure that they can be completely compliant.

The third thing is primarily the shift between security inside and outside of the organization. It’s been a fundamental shift for us. We’ve seen security move from people trusting their own infrastructure to using a third party who can provide that security to a far higher standard, because that’s what they do all day, every day. That security shift from on-premises to the cloud is a third big driver for us, and we’ve seen that in the market.

Gardner: You’re in a unique position to be able to comment on this. Tell us about Intralinks, what the company does, and why security at the edge is part of your core competency.

Secure collaboration

Glenister: We’re a software-as-a-service (SaaS) provider and we provide secure collaboration for data wherever that data is, whether it’s inside a corporation or shared outside. Typically, once people share data outside, whether through email or any other method, including some of the commercial tools out there, they have lost control of that data.

We have the ability to actually lock that data down, control it, and put the governance and compliance around it to secure it, to know where the high-value intellectual property (IP) is and who has access to it, and then still be able to share. And, if you’re at risk of losing data, you can revoke access for someone who has left the organization.

Gardner: And these are industries that have security as a paramount concern. So, we’re talking about finance and insurance. Give us a little bit more indication of the type of data we’re talking about.

Glenister: It’s anybody with high-value IP or compliance requirements — banking, finance, healthcare, life sciences, for example, and manufacturing. Even when you’re looking at manufacturing overseas and you have IP going over to China to manufacture your product, your plans are also being shared overseas. We’ve seen a lot of companies now asking how to protect those plans and therefore, protect IP.

Gardner: Chris, Intralinks seems to be ahead of the curve, recognizing how cloud can be an enabler for security. We’re surely seeing a shift in the market, at least I certainly am. In the last six months or so, companies that were saying that security was a reason not to go to the cloud are now saying that security is a reason they’re going to the cloud. They can attain security better. What’s happened that has made that perspective flip?

Steffen: I don’t know exactly what’s happened, but you’re absolutely right; that flip is going on. We’ve done a lot of research recently and shown that when you’re looking at inherent barriers going to a cloud solution, security and compliance considerations are always right there at the top. We commissioned the study through 451 Research, and we kind of knew that’s what was going on, but they sure nailed it down, one and two, security and compliance, right there. [Get a copy of the report.]

The reality, though, is that the C-table, executives, IT managers, those types, are starting to look at the massive burden of security and hoping to find help somewhere. They can look at a provider like Intralinks, they can look at a provider like HPE, and ask, “How can they help us meet our security requirements?”

They can’t just third-party their security requirements away. That’s not going to cut it with all the regulators that are out there, but we have solutions. HPE has a solution, Intralinks has solutions, a lot of third-party providers have solutions that will help the customer address some of those concerns, so those guys can actually sleep at night.

Gardner: We’re hearing so much about digital disruption in so many industries, and we’re hearing about why IT can’t wait, IT needs to be agile and have change in the business model to appeal to customers to improve their user experience.

It seems that security concerns have been a governor on that. “We can’t do this because ‘blank’ security issue arises.” It seems to me that it’s a huge benefit when you can come to them and say, “We’re going to allow you to be agile. We’re going to allow you to fight back against disruption because security can, in fact, be managed.” How far are we to converting disruption in security into an enabler when you go to the cloud?

Very difficult

Glenister: The biggest thing for most organizations is that they’re large, and it’s very difficult to transform the legacy systems and processes that are in place. It’s very difficult for organizations to change quickly. To actually drive that, they have to look at alternatives, and that’s why a lot of people move into cloud. Driving the move to the cloud is, “Can we quickly enable the business? Can we quickly provide those solutions, rather than having to spend 18 months trying to change our process and spend millions of dollars doing it?”

Enablement of the business is actually driving the need to go to the cloud, and obviously will drive security around that. To Chris’s point a few minutes ago, not all vendors are the same. Some vendors are in the cloud and they’re not as secure as others. People are looking for trusted partners like HPE and Intralinks, and they are putting their trust and their crown jewels, in effect, with us because of that security. That’s why we work with HPE, because they have a similar philosophy around security as we do, and that’s important.

Steffen: The only thing I would add to that is that security is not only a concern of the big business or the small business; it’s everybody’s concern. It’s one of those things where you need to find a trusted provider. You need to find that provider that will not only understand the requirements that you’re looking for, but the requirements that you have.

This is my opinion, but when you’re kicking tires and looking at your overall compliance infrastructure, there’s a pretty good chance you had to have that compliance for more than a day or two. It’s something that has been iterative; it may change, it may grow, whatever.

So, when you’re looking at a partner, a lot of different providers will start to at least try to ensure that you don’t start at square-one again. You don’t want to migrate to a cloud solution and then have all the compliance work that you’ve done previously just wiped away. You want a partner that will map those controls and that really understands those controls.

Perfect examples are in the financial services industry. There are 10 or 11 regulatory bodies that some of the biggest banks in the world all have to be compliant with. It’s extremely complicated. You can’t really expect that Big Bank 123 is going to just throw away all that effort, move to whatever provider, and hope for the best. Obviously, they can’t be that way. So the key is to take a map of those controls, understand those controls, then map those controls to your new environment.

Gardner: Let’s get into a little bit of the how … How this happens. What is it that we can do with security technology, with methodologies, with organizations that allow us to go into cloud, remove this notion of a boundary around your organization and do it securely? What’s the secret sauce, Daren?

Glenister: One of the things for us, being a cloud vendor, is that we can protect data outside. We have the ability to actually embed the security into documents wherever documents go. Instead of just having the control of data at rest within the organization, we have the ability to actually control it in motion inside and outside the perimeter.

You have the ability to control that data, and if you think about sharing with third parties, quite often people say, “We can’t share with a third-party because we don’t have compliance, we don’t have a security around it.” Now, they can share, they can guarantee that the information is secure at rest, and in motion.

Typically, if you look at most organizations, they have at-rest data covered. Those systems and procedures are relatively child’s play; that’s been covered for many years. The challenge is the data in motion. How do you actually extend working with third parties and with outside organizations?
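
As a rough illustration of that in-motion control, where documents stay encrypted wherever they travel and access can be revoked later, consider the sketch below built on the open-source cryptography library’s Fernet primitive. It is a conceptual example, not Intralinks’ implementation.

```python
# Conceptual sketch: a shared document stays encrypted wherever it goes, and the
# key service can revoke a recipient's access later. Not Intralinks' product.
from cryptography.fernet import Fernet   # pip install cryptography

class KeyService:
    """Holds per-document keys and the set of users allowed to fetch them."""
    def __init__(self):
        self._keys = {}   # doc_id -> symmetric key
        self._acl = {}    # doc_id -> set of user ids

    def protect(self, doc_id: str, users: set) -> Fernet:
        key = Fernet.generate_key()
        self._keys[doc_id], self._acl[doc_id] = key, set(users)
        return Fernet(key)

    def key_for(self, doc_id: str, user: str) -> Fernet:
        if user not in self._acl.get(doc_id, set()):
            raise PermissionError(f"{user} no longer has access to {doc_id}")
        return Fernet(self._keys[doc_id])

    def revoke(self, doc_id: str, user: str):
        self._acl[doc_id].discard(user)

svc = KeyService()
cipher = svc.protect("deal-123.docx", {"alice@corp.com", "bob@partner.com"})
blob = cipher.encrypt(b"confidential deal terms")       # this blob can travel anywhere

svc.revoke("deal-123.docx", "bob@partner.com")          # e.g., Bob leaves the organization
svc.key_for("deal-123.docx", "alice@corp.com").decrypt(blob)  # still works for Alice
# svc.key_for("deal-123.docx", "bob@partner.com")       # raises PermissionError
```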

Innovative activities

Gardner: It strikes me that we’re looking at these capabilities through the lens of security, but isn’t it also the case that this enables entirely new innovative activities. When you can control your data, when you can extend where it goes, for how long, to certain people, under certain circumstances, we’re applying policies, bringing intelligence to a document, to a piece of data, not just securing it but getting control over it and extending its usefulness. So why would companies not recognize that security-first brings larger business benefits that extend for years?

Glenister: Historically, security has always been, “No, you can’t do this, let’s stop.” If you look in a finance environment, it’s stop using thumb drives, stop using email, stop using anything, rather than making things easy. We’ve seen a transition. Over the last six months, you’re starting to see a transition where people are saying, “How do we enable? How do we give people control?” As a result of that, you see new solutions coming out from organizations, and how they can impact the bottom line.

Gardner: Behavior modification has always been a big part of technology adoption. Chris, what is it that we can do in the industry to show people that being secure, and extending that security to wherever the data is going to go, gives us much more opportunity for innovation? To me this is a hugely enticing carrot that I don’t think people have fully grokked.

Steffen: Absolutely. And the reality of it is that it’s an educational process. One of the things that I’ve been doing for quite some time now is trying to educate people. I can talk with a fellow CISSP and we can talk about Diffie-Hellman encryption and I promise that your CEO does not care, and he shouldn’t. He shouldn’t ever have to care. That’s not something that he needs to care about, but he does need to understand total cost of ownership (TCO), he needs to understand return on investment (ROI). He needs to be able to go to bed at night understanding that his company is going to be okay when he wakes up in the morning and that his company is secure.

It’s an iterative process; it’s something that they have to understand. What is cloud security? What does it mean to have defense in depth? What does it mean to have a matured security policy vision? Those are things that really change the attitudinal barriers that you have at a C-table that you then have to get past.

Security practitioners, those tinfoil hat types — I classify myself as one of those people, too — truly believe that they understand how data security works and how the cloud can be secured, and they already sleep well at night. Unfortunately, they’re not the ones who are writing the checks.

It’s really about shifting that paradigm of education from the practitioner level, where they get it, up to the CIO, the CISO who hopefully understands, and then up to the C-table and the CFO making certain that they can understand and write that check to ensure that going to a cloud solution will allow them to sleep at night and allow the company to innovate. They’ll take any security as an enabler to move the business forward.

Gardner: So, perhaps it’s incumbent upon IT and security personnel to start to evangelize inside their companies as to the business benefits of extended security, rather than the glass is always half empty.

Steffen: I couldn’t agree more. It’s a unique situation. Having your — again, I’ll use the term — tinfoil hat people talking to your C-table about security — they’re big and scary, and so on. But the reality of it is that it really is critically important that they do understand the value that security brings to an organization.

Going back to our original conversations, in the last 6 to 12 months, you’re starting to see that paradigm shifted a little bit, where C-table executives aren’t satisfied with check-box compliance. They want to understand what it takes to be secure, and so they have experts in house and they want to understand that. If they don’t have experts in-house, there are third-party partners out there that can provide that amount of education.

Gardner: I think it’s important for us to establish that the more secure and expert you are at security the more of a differentiator you have against your competition. You’re going to clean up in your market if you can do it better than they can.

Step back

Steffen: Absolutely, and even bring that a step further back. People have been talking for two decades now about technology as a differentiator and how you can make a technical decision or embrace and exploit technology to be the differentiator in your vertical, in your segment, and so on.

The credit reporting agency that I worked for a long time ago was one of those innovators, and people thought we were nuts for doing some of the stuff that we are doing. Years later, everybody is doing the same thing now.

It really can set up those things. Security is that new frontier. If you can prove that you’re more secure than the next guy, that your customer data is more secured than the next guy, and that you’re willing to protect your customers more than the next guy, maybe it’s not something you put on a billboard, but people know.

Would you still go to retailer A after they’ve had a credit card breach, or do you decide to go to retailer B? It’s not a straw man. Talk to Target, talk to Home Depot, talk to some of these big-box stores that have had breaches and ask how their numbers looked after they had to announce that they had a breach.

Gardner: Daren, let’s go to some examples. Can you think of an example of an Intralinks security capability that became a business differentiator or enabler?

Glenister: Think about banks at the moment, where they’re working with customers. There’s a drive for security. Security people have always known about security and how they can enable and protect the business.

But what’s happening is that the customers are now more demanding because the media is blowing up all of the cyber crimes, threats, and hacks. The consumer is now saying they need their data to be protected.

A perfect example is my daughter, who was applying for a credit card recently. She’s going off to college. They asked her to send a copy of her passport, Social Security card, and driver’s license to them by email. She looked at me and said, “What do you think?” It’s like, “No. Why would you?”

People have actually voted, saying they’re not going to do business with that organization. If you look in the finance organizations now, banks and the credit-card companies are now looking at how to engage with the customer and show that they have been securing and protecting their data to enable new capabilities like loan or credit-card applications and protecting the customer’s data, because customers can vote with their feet and choose not to do business with you.

So, it’s become a business-enabler to say we’re protecting your data and we have your concerns at heart.

Gardner: And it’s not to say that that information shouldn’t be made available to a credit-card company or an agency that’s ascertaining credit, but you certainly wouldn’t do it through email.

Insecure tool

Glenister: Absolutely, because email is the biggest sharing tool on the planet, but it’s also one of the most insecure tools on the planet. So, why would you trust your data to it?

Steffen: We’ve talked about security awareness, the security-awareness culture, and security-awareness programs. If you have a vendor management program, or you’re subject to vendor management from some other entity, one of the things they will also request is that you have a security awareness program.

Even five to seven years ago, people looked at that as drudgery. It was the same thing as all the other nonsensical HR training that you have to sit through. Maybe, to some extent, it still is, but the reality is that when I’ve given those programs, people are actually excited. It’s not only because you get the opportunity to understand security from a business perspective, but a good security professional will then apply that: “By the way, your email is not secure here, and it’s not secure at home either. Don’t be stupid here, but don’t be stupid there either.”

We’re going to fix the router passwords at work; you don’t need to worry about that. But if you have a home router, change the default password. Those sound like very simple, straightforward things, but when you share that with your employees and you build that culture, not only do you have more secure employees, but the culture of your business and the culture of security changes.

In effect, what’s happening is that you’ll finally be getting to see that translate into stuff going on outside of corporate America. People are expecting to have information security parameters around the businesses that they do business with. Whether it’s from the big-box store, to the banks, to the hospitals, to everybody, it really is starting to translate.

Glenister: Security is a culture. I look at a lot of companies for whom we do once-a-year certification or attestation, an online test. People click through it, and some may have a test at the end and they answer the questions and that’s it, they’re done. It’s nice, but it has to be a year-round, day-to-day culture with every organization understanding the implications of security and the risk associated with that.

If you don’t do that, if you don’t embed that culture, then it becomes a one-time event, and you’re secure only once a year.

Steffen: We were talking about this before we started. I’m a firm believer in security awareness. One of the things that I’ve always done is take advantage of these pretend Hallmark holidays. The latest one was Star Wars Day. Nearly everybody has seen Star Wars or certainly heard of Star Wars at some point or another, and you can’t even go into a store these days without hearing about it.

For Star Wars Day, I created a blog to talk about how information-security failures led to the downfall of the Galactic Empire.

It was a fun blog. It wasn’t supposed to be deadly serious, but the kicker is that we talked about key information-security points. You use that holiday to get people engaged with what’s going on and educate them on some key concepts of information security, and almost accidentally, they’re learning. That learning then carries over to the next blog that you do, and maybe they pay a little bit more attention to it. Maybe they pay attention to someone piggybacking through the door behind them, and maybe they pay attention to not putting something in an email, and so on.

It’s still a little iterative thing; it’s not going to happen overnight. It sounds silly talking about information security failures in Star Wars, but those are the kind of things that engage people and make people understand more about information security topics.

Looking to the future

Gardner: Before we sign off, let’s put on our little tinfoil hat with a crystal ball in front. If we’ve flipped in the last six months or so, people now see the cloud as inherently more secure, and they want to partner with their cloud provider to do security better. Let’s go out a year or two, how impactful will this flip be? What are the implications when we think about this, and we take into consideration what it really means when people think that cloud is the way to go to be secure on the internet?

Steffen: The one that immediately comes to mind for me — Intralinks is actually starting to do some of this — is you’re going to see niche cloud. Here’s what I mean by niche cloud. Let’s just take some random regulatory body that’s applicable to a certain segment of business. Maybe they can’t go to a general public cloud because they’re regulated in a way that it’s not really possible.

What you’re going to see is a cloud service that basically says, “We get it, we love your type, and we’re going to create a cloud. Maybe it will cost you a little bit more to do it, but we understand from a compliance perspective the hell that you are going through. We want to help you, and our cloud is designed specifically to address your concerns.”

When you have niche cloud, all of a sudden, it opens up your biggest inherent barriers. We’ve already talked about security. Compliance is another one, and compliance is a big fat ugly one. So, if you have a cloud provider that’s willing to maybe even assume some of the liability that comes with moving to their cloud, they’re the winners. So let’s talk 24 months from now. I’m telling you that that’s going to be happening.

Gardner: All right, we’ll check back on that. Daren, your prediction?

Glenister: You are going to see a shift that we’re already seeing, and Chris will probably see this as well. It’s a shift from discussions around security to transformation. You definitely see security now transforming business, enabling businesses to do things and interact with their customers in ways they’ve never done before.

You’ll see that impacting things in two ways. One is going to be new business opportunities, so revenue coming in, but it’s also going to streamline internal processes, making things easier to do internally. And you’ll see a transformation of the business inside and outside. That’s going to drive a lot of new opportunities, new capabilities, and innovations we haven’t seen before.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

How software-defined storage translates into just-in-time data center scaling and hybrid IT benefits

The next BriefingsDirect Voice of the Customer case study examines how hosting provider Opus Interactive adopted a software-defined storage approach to better support its thousands of customers.

We’ll learn how scaling of customized IT infrastructure for a hosting organization in a multi-tenant environment benefits from flexibility of modern storage, unified management, and elastic hardware licensing. The result is gaining the confidence that storage supply will always meet dynamic hybrid computing demand — even in cutting-edge hosting environments.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe how massive storage and data-center infrastructure needs can be met in a just-in-time manner, we’re joined by Eric Hulbert, CEO at Opus Interactive in Portland, Oregon. The discussion is moderated by BriefingsDirect’s Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What were the major drivers when you decided to re-evaluate your storage, and what were the major requirements that you had?

Hulbert: Our biggest requirement was high availability in a multi-tenant environment. That was number one, because we’re service providers and we have to meet the needs of a lot of customers, not just a single enterprise or even enterprises with multiple business groups.

So we were looking for something that met those requirements. Cost was a concern as well. We wanted it to be affordable, but needed it to be enterprise-grade with all the appropriate feature sets — but most importantly, it had to be a scale-out architecture.

We were tired of the monolithic controller-bound SANs, where we’d have to buy a specific bigger size. We’d start to get close to where the boundary would be and then we would have to do a lift-and-shift upgrade, which is not easy to do with almost a thousand customers.

Ultimately, we made the choice to go to one of the first software-defined storage architectures, which is a company called LeftHand Networks, later acquired by Hewlett Packard Enterprise (HPE), and then some 3PAR equipment, also acquired by HPE. Those were, by far, the biggest factors while we made that selection on our storage platform.

Gardner: Give us a sense of the scale-out requirements.

Hulbert: We have three primary data centers in the Pacific Northwest and one in Dallas, Texas. We also have the ability for a little bit of space in New York, for some of our East Coast customers, and one in San Jose, California. So, we have five data centers in total.

Gardner: Is there a typical customer, or a wide range of customers?

Big range

Hulbert: We have a pretty big range. Our typical customers are in finance and travel and tourism, and the hospitality industries. There are quite a few in there. Healthcare is a growing vertical for us as well.

Then, we round that out with manufacturing and a little bit of retail. One of our actual verticals, if you can call it a vertical, is the MSPs, IT companies, and even some VARs that are moving into the cloud.

We enable them to do their managed services and be the “boots on the ground” for their customers. That spreads us into the tens of thousands of customers, because we have about 25 to 30 MSPs that work with us throughout the country, using our infrastructure. We just provide the infrastructure as a service, and that’s been a pretty fast-growing vertical for us.

Gardner: And then, across that ecosystem, you’re doing colocation, cloud hosting, managed services? What’s the mix? What’s the largest part of the pie chart in terms of the services you’re providing in the market?

Hulbert: We’re about 75 percent cloud hosting, specifically a VMware-based private cloud, a multi-tenant private cloud. It’s considered public cloud, but we call it private cloud.

We do a lot of hybrid cloud, where we have customers that are doing bursting into Amazon or [Microsoft] Azure. So, we have the ability to get them either Direct Connect Amazon connections or Azure ExpressRoute connections into any of our data centers. Then, 20 percent is colocation and about 5 percent for back-up, and disaster recovery (DR) rounds that out.

Gardner: Everyone, it seems, is concerned about digital disruption these days. For you, disruption is probably about not being able to meet demand. You’re in a tight business, a competitive business. What’s the way that you’re looking at this disruption in terms of your major needs as a business? What are your threats? What keeps you up at night?

Still redundant

Hulbert: Early on, we wanted a concurrently maintainable infrastructure, which also follows through to the data centers that we’re in. So, we needed Tier 3-plus facilities that are concurrently maintainable. We wanted the infrastructure to be the same. We’re not kept up at night, because we can take an entire section of our solution offline for maintenance. It could be a failure, but we’re still redundant.

It’s a little bit more expensive, but we’re not trying to compete with the commodity hosting providers out there. We’re very customized. We’re looking for customers that need more of that high-touch level of service, and so we architect these big solutions for them — and we host with a 100 percent up-time.

The infrastructure piece is scalable with scale-out architecture on the storage side. We use only HP blades, so that we just keep stacking in blades as we go. We try to stay a couple of blade chassis ahead, so that we can take pretty large bursts of that infrastructure as needed.

That’s the architecture I would recommend for other service providers looking for a way to make sure they can scale out without having to do any lift-and-shift on their SAN, or even stack-and-rack servers, which take more time.

You have to cable all of those individual servers, versus just cabling one blade chassis. Then, you can slot in 16 blades quickly as you’re scaling. That allows you to scale quite a bit faster.

Gardner: When it comes to making the choice for software-defined, what has that gotten you? I know people are thinking about that in many cases — not just service providers, but enterprises. What did service-defined storage get for you, and are you furthering your software-defined architecture to more parts of your infrastructure?

Hulbert: We wanted it to be software-defined because we have multiple locations and we wanted one pane of glass. We use HPE OneView to manage that, and it would be very similar for an enterprise. Say you have 30 remote offices that want to put equipment there, and the business units need to provision some servers and storage. You don’t want to be going to each individual appliance, chassis, or application; you want one place to provision it all.

Since we’re dealing now with nearly a thousand customers — and thousands and thousands of virtual servers, storage nodes, and all of that — the chunklets of data are distributed across all of these. Being able to manage that from one single pane of glass is quite important for us.

So, it’s that software-defined aspect, especially distributing the data into chunklets, that allows us to grow quicker, along with putting a lot of automation on the back end.

We only have 11 system administrators and engineers on our team managing that many servers, which shows you that our density is pretty high. That only works well if we have really good management tools, and having it software-defined means fewer people walking to and from the data center.

Even though our data centers are manned facilities, our infrastructure is basically lights out. We do everything from remote terminals.
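
To give a flavor of that lights-out, remote-terminal operation, here is a hedged sketch of provisioning a tenant volume through a management REST API. The endpoint, payload fields, and token are hypothetical placeholders, not the HPE OneView API or Opus Interactive’s tooling.

```python
# Hypothetical example: provisioning a tenant volume through a management REST
# API from a remote terminal. Endpoint and payload fields are invented; this is
# not the HPE OneView API or Opus Interactive's tooling.
import requests

MGMT_API = "https://mgmt.example.net/api/v1"
HEADERS = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}

def provision_volume(tenant: str, size_gb: int, site: str) -> dict:
    payload = {"tenant": tenant, "size_gb": size_gb, "site": site,
               "redundancy": "network-raid-10"}     # illustrative setting
    resp = requests.post(f"{MGMT_API}/volumes", json=payload,
                         headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()     # e.g., the new volume's id and export details

# provision_volume("customer-042", size_gb=500, site="portland")
```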

Gardner: And does this software-defined extend across networking as well? Are you hyper-converged, converged? How would you define where you’re going or where you’d like to go?

Converged infrastructure

Hulbert: We’re not hyper-converged. For our scale, we can’t get into the prepackaged hyper-converged product. For us, it would be more of a converged infrastructure approach.

As I said, we do use the c-Class blade chassis with Virtual Connect, which is software-defined networking. We do a lot of VLANs and things like that on the software side.

We still have some networking outside of that, out of band (the network stacks), because we’re not just a cloud provider. We also do colocation and a lot of hybrid computing where people are connecting between the two. So, we have to worry about Fibre Channel, iSCSI, and connections into the SAN.

That adds a couple of other layers and a few extra management steps, but at our scale, it’s not like we’re adding tens of thousands of servers a day, or even an hour, as I’m sure Amazon has to. So we can take that one small hit to pull that portion of the networking out, and it works pretty well for us.

Gardner: How do you see the evolution of your business in terms of moving past disruption, adopting these newer architectures? Are there types of services, for example, that you’re going to be able to offer soon or in the foreseeable future, based on what you’re hearing from some of the vendors?

Hulbert: Absolutely. One of the first ones I mentioned earlier was the ability for customers that want to burst into the public cloud to do Amazon Direct Connects. Even over the telecom providers’ networks, you’re looking at 15 to 25 milliseconds of latency. For some of these applications, that’s just too much latency. So, it’s not going to work.

Now, with the most recent announcement from Amazon, they put a physical Direct Connect node in Oregon, about a mile from our data-center facility. It’s from EdgeConneX, who we partnered with.

Now, we can offer the lowest latency for both Amazon Direct Connect and Azure ExpressRoute in the Pacific Northwest, specifically in Oregon. That’s really huge for our customers, because we have some that do a lot of public-cloud bursting on both platforms. So that’s one new offering we’re doing.

Disruption, as we’ve heard, is around containers. We’re launching a new container-as-a-service platform later this year based on ContainerX. That will allow us to do containers for both Windows and *nix platforms, regardless of what the developers are looking for.

We’re targeting developers, DevOps guys, who are looking to do microservices to take their application, old or new, and architect it into the containers. That’s going to be a very disruptive new offering. We’ve been working on a platform for a while now because we have multiple locations and we can do the geographic dispersion for that.

I think it’s going to take a little bit of the VMware market share over time. We’re primarily a VMware shop, but I don’t think it’s going to be too much of an impact to us. It’s another vertical we’re going to be going after. Those are probably the two most important things we see as big disruptive factors for us.

Hybrid computing

Gardner: As an organization that’s been deep into hybrid cloud and hybrid computing, is there anything out there in terms of the enterprises that you think they should better understand? Are there any sort of misconceptions about hybrid computing that you detect in the corporate space that you would like to set them straight on?

Hulbert: The hybrid that people typically hear about is more like having on-premises equipment. Let’s say I’m a credit union and, at one of the bank branches, we decided to put three or four cabinets of our equipment in one of the vaults. Maybe they’ve added one UPS and one generator, but it’s not to the enterprise level, and they’re bursting to the public cloud for the things that make sense given their security requirements.

To me, that’s not really the best use of hybrid IT. Hybrid IT is where you’re putting what used to be on-premises in an actual enterprise-level, Tier 3 or higher data center. Then, you’re using either a form of bursting into private dedicated cloud from a provider in one of those data centers or into the public cloud, which is the most common definition of that hybrid cloud. That’s what I would typically define as hybrid cloud and hybrid IT.

Gardner: What I’m hearing is that you should get out of your own data center, use somebody else’s, and then take advantage of the proximity in that data center, the other cloud services that you can avail yourself of.

Hulbert: Absolutely. The biggest benefit to them is at their individual locations or bank branches. This is the scenario where we use the credit union example. They’re going to have maybe one or two telco providers, and those are going to be 100 or maybe 200 Mb-per-second circuits.

They’re paying quite a premium for those, and when they get into one of these data centers, they’re going to have the ability to have 10-gig, or even 40- or 100-gig, connected internet pipes with a lot more headroom for connectivity at a better price point.

On top of that, they’ll have 10-gig connection options into the cloud, all the different cloud providers. Maybe they have an Oracle stack that they want to put on an Oracle cloud some day alongside their own on-premises equipment. The hybrid piece gets more challenging, because now they’re not going to get the connectivity they need. Maybe they want to do Amazon or Azure, or maybe they want an Opus cloud.

They need faster connectivity for that, but they have equipment that still has usable life. Why not move that to an enterprise-grade data center and not worry about air-conditioning challenges, electrical problems, or whether it’s secure?

All of these facilities, including ours, check every box for the compliance and auditing that happens on an annual basis. Those things that used to be real headaches aren’t core to their business. They don’t have to do them anymore. They can focus on what’s core: the application and their customers.

Gardner: So proximity still counts, and probably will count for an awfully long time. You get benefits from taking advantage of proximity in these data centers, but you can still have, as you say, what you consider core under your control, under your tutelage and set up your requirements appropriately?

Mature model

Hulbert: It really comes down to the fact that the cloud model is very mature at this point. We’ve been doing it for over a decade. We started doing cloud before it was even called cloud. It was just virtualization. We launched our platform in late 2005 and it proved out, time and time again, with 100 percent up-time.

We have one example of a large customer, a travel and tourism operator, that brings visitors from outside the US to the US. They do over $1 billion a year in revenue, and we host their entire infrastructure.

It’s a lot of infrastructure and it’s a very mature model. We’ve been doing it for a long time, and that helps them to not worry about what used to be on-premises for them. They moved it all. A portion of it is colocated, and the rest is all on our private cloud. They can just focus on the application, all the transactions, and ultimately on making their customers happy.

Gardner: Going back to the storage equation, Eric, do you have any examples of where the software-defined storage environment gave you the opportunity to satisfy customers or hit price points? Any business or technical metrics that demonstrate how this new approach to storage fills out the cost equation?

Hulbert: In terms of the software-defined storage, the ability to easily provision the different sized data storage we need for the virtual servers that are running on that is absolutely paramount.

We need super-quick provisioning, so we can move things around. When you add in the layers of VMware, like storage vMotion, we can replicate volumes between data centers. Having that software-defined makes that very easy for us, especially with the built-in redundancy that we have and not being controller-bound like we mentioned earlier on.

Those are pretty key attributes, but on top of that, as customers grow, we can very easily add more volumes for them. Say they have a footprint in our Portland facility and want to add a footprint in our Dallas, Texas facility and do geographic load balancing. It makes it very easy for us to run the applications across the two facilities, slowly adding on those layers as customers need to grow. It makes that easy for them as well.
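As a rough illustration of that kind of self-service provisioning, here is a hedged Python sketch that creates a volume in one facility and pairs it with an asynchronous replica in another. The REST endpoints, payload fields, credentials, and site names are hypothetical stand-ins for whatever software-defined storage management API is actually in use; this is not a specific vendor's interface.

```python
# Hypothetical sketch: provision a volume in one site and replicate it to another.
import requests

STORAGE_API = "https://sds.example.net/api/v1"   # assumed management endpoint
AUTH = ("svc-provisioner", "example-password")   # assumed service credentials

def provision_volume(name: str, size_gb: int, site: str) -> str:
    """Create a volume at the given site and return its identifier."""
    resp = requests.post(f"{STORAGE_API}/volumes",
                         json={"name": name, "size_gb": size_gb, "site": site},
                         auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()["volume_id"]

def replicate(volume_id: str, target_site: str) -> None:
    """Set up asynchronous replication to a second facility."""
    resp = requests.post(f"{STORAGE_API}/volumes/{volume_id}/replication",
                         json={"target_site": target_site, "mode": "async"},
                         auth=AUTH, timeout=30)
    resp.raise_for_status()

if __name__ == "__main__":
    vol = provision_volume("customer-app-data", 500, site="portland")
    replicate(vol, target_site="dallas")
    print(f"Volume {vol} provisioned in Portland and replicating to Dallas")
```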


Gardner: One last question: what comes next in terms of containers? What we’re seeing is that containers have a lot to do with developers and DevOps, but ultimately I’d think that the envelope gets pushed out into production, especially when you hear about things like composable infrastructure. If you’ve been composing infrastructure in the earlier, development part of the process, it takes care of itself in production.

Do you actually see more of these trends accomplishing that, where production is lights-out like yours, and where more of the definition of infrastructure, applications, productivity, and capabilities happens in that development and DevOps stage?

Virtualization

Hulbert: Definitely. Over time, it is going to be very similar to what we saw when customers were moving from dedicated physical equipment into the cloud, which is really virtualization.

This is the next evolution, where we’re moving into containers. At the end of the day, the developers, the product managers for the applications for whatever they’re actually developing, don’t really care what and how it all works. They just want it to work.

They want it to be a utility consumption-based model. They want the composable infrastructure. They want to be able to get all their microservices deployed at all these different locations on the edge, to be close to their customers.

Containers are going to be a great way to do that, because they take away the overhead of dealing with operations. So, the teams can just put the little APIs and the different things that they need where they need them. As we see more of that stuff pushed to the edge to get the eyeball traffic, that’s going to be a great way to do it. With the ability to do even further bursting into the bigger public clouds worldwide, I think we can get to a really large scale in a great way.
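For a sense of what that pattern looks like in practice, here is a small sketch using the Docker SDK for Python to run a microservice container at an edge host, hands-off once deployed. The image name, registry, port mapping, and labels are placeholders; the point is the pattern, not these specifics.

```python
# Sketch: run a packaged microservice at an edge location via the Docker SDK.
import docker

def deploy_edge_service(image: str, host_port: int) -> str:
    client = docker.from_env()  # assumes a Docker daemon running at the edge site
    container = client.containers.run(
        image,
        detach=True,
        ports={"8080/tcp": host_port},        # expose the service's assumed API port
        restart_policy={"Name": "always"},     # survive host reboots unattended
        labels={"tier": "edge", "managed-by": "platform-team"},
    )
    return container.short_id

if __name__ == "__main__":
    # The registry and image tag below are illustrative placeholders.
    cid = deploy_edge_service("registry.example.net/orders-api:1.4.2", 8080)
    print(f"orders-api running in container {cid}")
```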

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.



How IT innovators turn digital disruption into a business productivity force multiplier

The next BriefingsDirect business innovation thought leadership panel discussion examines how digital business transformation has been accomplished by several prominent enterprises. We’ll explore how the convergence of cloud, mobility, and big-data analytics has prompted companies to innovate and produce new levels of award-winning productivity.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn how these trend-setters create innovation value, we’re joined by some finalists from the Citrix Synergy 2016 Innovation Awards Program: Olaf Romer, Head of Corporate IT and group CIO at Bâloise in Basel, Switzerland; Alan Crawford, CIO of Action for Children in London, and Craig Patterson, CEO of Patterson and Associates in San Antonio, Texas. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Olaf, what are the major trends that drove you to reexamine the workplace conceptually, and how did you arrive at your technology direction for innovating in that regard?

Romer: First of all, we’re a traditional Swiss insurance company. So, our driver was to become a little bit more modern to attract the new generation of people to our company. In Switzerland, this is a little bit of a problem. We also have the big companies in Zurich, for example. So, it’s very important for us.

Romer

We did this in two directions. One direction is on the IT side, and the other is on the real-estate side. We changed from the traditional office boxes to a flex office with open space, like Google has. Nobody has their own desk, not even me. We can go anywhere in our office and sit with whomever we need to. It’s the same on the IT side. We’re going in this direction for more mobility and an easier way to work in our company.

Gardner: And because you’re an insurance organization, you have a borderless type of enterprise, where you need to interact with field offices, other payers, suppliers, and customers, of course.

Was that ability to deal with many different types of end-point environments also a concern, and how did you solve that?

Romer: The first step was inside our company, and now, we want to go outside to our brokers and to our customers. The security aspect is very, very important. We’re still working on being absolutely secure, because we’re handling sensitive customer data. We’re still in the process of opening our ecosystem outward to the brokers and customers, but also to other companies we work with. [See related post, Expert panel explores the new reality for cloud security and trusted mobile apps delivery.]

Gardner: Alan, tell us about Action for Children and what you’ve been doing in terms of increasing the mobile style of interactions in business.

Crawford: Action for Children is a UK charity. It helps 300,000 children, families, and young people every year. About 5,000 staff operate from between 300 and 500 branches; 300 are our own, and a couple of hundred locations are with our partner agencies.

Crawford

When I started there, the big driver was around security and mobility. A lot of the XP computers were running out of support, and the staff outside the office were working on paper.

There was a great opportunity in giving modern tablets to staff to improve the productivity. Productivity in our case means that if you spend less time doing unnecessary visits or do something in one visit instead of three, you can spend more quality time with the family to improve the outcomes for the children.

Gardner: And, of course, as a non-profit organization, costs are always a concern. We’ve heard an awful lot here at Citrix Synergy about lower-cost client and endpoint devices. Has that been good news to your ears? [Learn more about Citrix Synergy 2016.]

Productivity improvements

Crawford: It has. We started with security and productivity as the main drivers, but as we’ve rolled out, we’ve seen those productivity improvements arise. Now, we’re looking at the cost side, at the savings we can make on travel, print, and stationery. Our starting budget this year is £1.3 million ($1.7 million) less than it was the year before we introduced tablets for those things. We’re trying to work out exactly how much of that we can attribute to the mobile technology and how much is due to other factors.

Gardner: Craig, you’re working with a number of public sector organizations. Tell us about what they are facing and what mobility as a style of work means to them.

Patterson: Absolutely. I’m working with a lot of public housing authorities. One is Lucas Metropolitan, and another is the Hampton Redevelopment Agency. What they’re facing is declining budgets and a need to do more with less.

Patterson

When we look at traditional housing-authority and government-service agencies that are paper-based, paper just continues to multiply. You put one piece in the copier and 20 pieces come out. So, being able to take the documents that contain secure private information of our clients and connect those with the clients out in the field is why we need mobility and efficiency and workflows.

And the cloud is what came to mind with that. With content management, we can capture data out in the field. We can move our staff out into the field. We don’t have to bring all of the clients into the office, which can sometimes pose a hardship, especially for the elderly, the disabled, and many of those in greatest need. Mobility and efficiency with the cloud, along with security, have become paramount in how we perform our business.

Gardner: I suppose another aspect of mobility is the ability to bring data and analytics to the very edge. Have you been able to take advantage of that yet, or is it something that you’re going to be working toward?

Patterson: We know that it’s something we’re working toward. We know from the analytics that we’ve been able to see so far that mobility is the key. For some time, people have thought that we can’t put online things like applications for affordable housing, because people don’t have access to the Internet.

Our analytics prove that entirely wrong. Age groups of 75 and 80 were accessing it on mobile devices faster than the younger group was. What it means is that they find a relative, a grandchild, or whoever they need to help them access the Internet. It’s been our own mindset that has kept us from making the Internet and those mobility avenues into our systems available on a broader scale. So, we’re moving in that direction, so that self-service can be offered to that community in a broader context.

Measuring outcomes

Crawford: On the analytics and how that’s helped by mobile working, we had a very similar result at Action for Children in the same year we brought out the tablets. We started to do outcome measures with the children we work with. For each child, we do a baseline measure when we first meet the family, and then maybe three months later, or whatever the period of the intervention is, we do a further measure.

Doing that directly on a tablet with the family present has really enhanced the outcome measures. We now have measures on 50,000 children and we can aggregate that, see what the trends are, see what the patterns are geographically by types of service and types of intervention.

Gardner: So it’s that two-way street; the more data and analytics you can bring down to the edge, the more you can actually capture and reapply, and that creates a virtuous cycle of improvement in productivity.

Crawford: Absolutely. In this case, we’re looking at the data and learning lessons about what works better to improve the outcomes for disadvantaged children, which is really what we’re about.

Gardner: Olaf, user experience is a big topic these days, and in insurance it goes right to the very edge, where there might be a settlement event of some sort, then back to the broker and back to the enterprise. User-experience improvements at every step of that ultimately mean a better, more productive outcome for your end customers. [See related post, How the Citrix Technology Professionals Program produces user experience benefits from greater ecosystem collaboration.]

How does user experience factor into this mobility, data, and analytics equation?

Romer: First of all, the insurance business is a little bit different than the others here. The problem is that our customers normally don’t want to touch us during the year. They get a one-time invoice from us, and they have to pay the premium. Then, they hope, and we also hope, that they will not have a claim.

We have only one touch a year, and this is a little bit of a problem. We try to do everything to be more attractive to the customer, so that for them it’s clear that if they have a problem or need new insurance, they go to Bâloise Insurance.

We’re working to bring in a little bit of consumerization. In former years, the insurance business was very difficult and it wasn’t transparent. Customers have to answer 67 questions before they can take out insurance with us, and this is the point. To make it as simple as possible and to work with new technology, we have to be attractive to the customers, for example by letting them take out insurance through an iPhone. That’s not so easy.

If you talk with a core insurance person about calculating the premiums, they won’t already have the 67 answers from the customers. So, it’s not only the technology; it’s also about working a little bit differently in the insurance business. The technology will also help us there. For me, the buzzword is big data, and now we have to bring out the value of the data we have in our business, so that we can go directly, with the right user interface, to the right customer area.

Gardner: Another concept that we have heard quite a bit at Synergy is the need to allow IT to say yes more often. Starting with you Craig, what are you seeing in the trends and in the technology that is perhaps most impactful for you to be able to say yes to the requests and the need for agility in these businesses, in these public sector organizations?

Device agnosticism

Patterson: It’s the device agnosticism, where you bring your own device (BYOD). It’s a device that the individuals are already familiar with. I’m going to take it from two angles. It could be the employee that’s delivering a service out to a customer in the field that can bring their own device, or a partner or contractor, so that we can integrate and shrink-wrap certain data. We will still have data security while they’re deploying or doing something out in the field for us. It could be inspections, customer service, medical, etc.

But then, on the client end, they have their own device. By being able to deliver products through portals that don’t care what device they have, based on mobile protocols and security, those are the types of trends that are going to allow us to collect the big analytics, test what we think we know, find out whether we really know it or not, and get the facts behind it.

The other piece of it though is to make it easy to access the services that we provide to the community, because now it’s a digital community; it’s not just the hardcore community. To see people in a waiting line now for applications hurts my feelings. We want to see them online, accessing it 24×7, when it makes sense for them. Those are the types of services that I see becoming the greater trends in our industry.

Gardner: Alan, what allows you to say “yes” more often?

Crawford: When I started, with the XP laptops, we were saying no. Compare that with our programs and centers now: they’re using the tablets and the technology. You have closed Facebook groups with those families. There’s now peer support outside hours, when children are going to bed, which is often when issues arise in a family.

They use Eventbrite, the booking app. There are some standard off-the-shelf apps, but take one of our services in a rural community, which currently tells everybody in that community what services it’s running through posters and flyers that were printed off. That has moved to developing our own app. The prototypes are already out there, and the full app will be out in a few weeks’ time. We’re saying yes to all of those things. We want to support them. It’s not just yes, but yes, and how can we help you do that.

Gardner: Olaf, of course, productivity is only as good as the metrics that we need to convince the higher-ups in the board room that we need more investment or that we’re doing good work with our technology. Do you have any measurements, metrics, even anecdotes about how you measure productivity and what you’ve done to modernize your workspaces?

Romer: Yes, for us it’s the feedback from the people. It’s very difficult to measure on a pure technology level, but the feedback from the people is very good and very important for us. With the BYOD we introduced a year and a half ago, you can see a strong cultural change in collaboration. We work together much more efficiently in the company and across the different departments.

In former times, we had closed file shares, and I couldn’t see the files of the department next to me. Now, we’re working in a completely modern, collaborative way. Still, on traditional insurance business, let’s say with the government, it’s very hard for them to work in the new style.

In the beginning, there were very strong concerns about that, and now we’re in a cultural shift on this. We get a lot of good feedback that in project teams, or in the case of some problems or issues, we can work much better and faster together.

Metrics of success

Gardner: Craig, of course it’s great to say yes to your constituents, but it’s also good to say that we’re doing more with less to your higher-ups and those that control the budget. Any metrics of success that you can recall in some of the public-sector organizations you’re working with?

Patterson: Absolutely. I’ll talk about documents and workflow. Before, when a document came into the organization, we mapped how much time and money it took to get it into a file folder and viewed by everyone who needed to see it. To give quick context: before, a document took a file folder, a label maker, and a copy machine, and every time a person needed to put a document in that folder, someone had to get it there. Now, the term “file clerk” is actually becoming obsolete.

When a document comes in, it gets scanned, it’s instantaneously put in the correct order in the right electronic folder, and an electronic notification is sent to the person who needs to know. That happens in seconds. When you look at each month, it amounts to real savings; before, we were managing files rather than assisting people.
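A toy Python sketch of that intake flow appears below: classify the scanned document, file it in the right electronic folder, and notify the right person, all without a file clerk. The document types, folder names, and addresses are invented for illustration and are not the agencies' actual systems.

```python
# Toy document-intake routing: classify, file, and notify in one pass.
from datetime import datetime

ROUTING_RULES = {  # hypothetical document types -> (folder, person to notify)
    "lease_application": ("applications/pending", "intake-team@example.org"),
    "inspection_report": ("inspections/2016", "facilities@example.org"),
    "invoice":           ("finance/unpaid", "ap-clerk@example.org"),
}

def notify(recipient: str, message: str) -> None:
    # Stand-in for an email or workflow notification
    print(f"[notify {recipient}] {message}")

def file_document(doc_id: str, doc_type: str) -> str:
    folder, recipient = ROUTING_RULES.get(doc_type,
                                          ("unsorted/review", "records@example.org"))
    path = f"{folder}/{doc_id}.pdf"
    # In a real system this step would write to the content-management store
    notify(recipient, f"{doc_type} {doc_id} filed at {path} "
                      f"on {datetime.now():%Y-%m-%d %H:%M}")
    return path

if __name__ == "__main__":
    file_document("DOC-10482", "lease_application")
```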

The metrics are in the neighborhood of about 75 percent paper reduction, because people aren’t making copies. That means they’re not going to the copy machine, and along the way, the water cooler and the conversation pits. That removes some of the inefficiencies as well. We can now see how many file folders you looked at and how many documents you actually touched, read, and reviewed in comparison with somebody else.

We’ve seen as few as five documents for one person, in comparison with 1,700 for another in a month. That starts to tell you some things about where your workload is shifting. Not everyone likes that. They might consider it a little bit “big brother,” but we need those analytics to know how best to change our workflows to serve our customer, and that’s the community.

Gardner: I don’t know if this is a metric that’s easy to measure, but less bureaucracy would be something that I think just about everyone would be in favor of. Can you point to something that says we’re able to reduce bureaucracy through technology?

Patterson: When you look at bureaucracy and unnecessary paper flows, there are certain yes-or-no questions that are part of bureaucracy. Somebody has it come across their desk, and their job is to stamp yes or no on it. What decision do they have to make? Well, they really don’t make one; they just have to stamp yes. To me, that’s classic bureaucracy.

Now, if the document hits that person’s desk and it meets certain criteria or a threshold, the computer automatically and instantaneously approves it, and it has a documented audit trail. That saves some of our clients in the housing-authority industry when the auditors come and review things. And if you did have to make a decision, the system forces you to know how long it took you to make it. So, we can look at why it’s taking so long, or whether there are questions that you don’t need to be answering.
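The rule Patterson outlines can be sketched in a few lines of Python: items under an assumed threshold are approved automatically with an audit entry, while larger ones go to a reviewer and the time to decide is recorded. The threshold, field names, and log format are hypothetical, not the actual housing-authority system.

```python
# Sketch of threshold-based auto-approval with a simple audit trail.
import time

AUDIT_LOG = []
AUTO_APPROVE_LIMIT = 500.00  # assumed dollar threshold for automatic approval

def decide(doc_id: str, amount: float, manual_decision=None) -> str:
    started = time.time()
    if amount <= AUTO_APPROVE_LIMIT:
        outcome, decided_by = "approved", "system"     # instant, rule-based
    else:
        outcome = manual_decision() if manual_decision else "pending"
        decided_by = "reviewer"                        # routed to a person
    AUDIT_LOG.append({
        "doc": doc_id,
        "amount": amount,
        "outcome": outcome,
        "decided_by": decided_by,
        "seconds_to_decide": round(time.time() - started, 3),
    })
    return outcome

if __name__ == "__main__":
    decide("REQ-201", 125.00)                        # auto-approved instantly
    decide("REQ-202", 2400.00, lambda: "approved")   # human decision, timed
    for entry in AUDIT_LOG:
        print(entry)
```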

Gardner: So let the systems do what they do best and let the people do the exception management and the value-added activities. Alan, you had some thoughts about metrics of success of bureaucracy or both?

Proxy measure

Crawford: Yes, it’s the metrics. The Citrix CEO [Kirill Tatarinov] talked at Citrix Synergy about productivity actually going down in the last few years. We’ve put all these tablets out there and we have individual case studies where we know a particular family-support worker has driven 1,700 miles in the year with the tablet, and it was 3,400 miles in the year without. That’s a proxy measure of how much time they’re spending on the road, and we have all the associated cost of fuel and wasted time and effort.

We’ve just installed an app (actually, I rolled it out in the last month or so) that measures how many tablets have been switched on in the month, how much they’ve been used in the day, and what they’ve been used for. We can break that down by geographical area and give that information back to the line managers, because they’re the people to whom it will actually make sense.

I’m right at the stage where it’s great information. It’s really powerful, but the challenge is actually to understand how many hours a day they should be using that tablet. We’re not quite sure, and it probably varies from one type of service to another.

We look at those trends over a period of months. We can tell managers that, yes, overall, staff used them 90 percent of the time, but it’s 85 percent in your area. All managers, I find, are fairly competitive.
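A simple sketch of that roll-up, with invented records and field names, might look like the following: compute each area's usage rate alongside the overall rate that gets reported back to line managers.

```python
# Illustrative roll-up of tablet-usage telemetry by area versus overall.
from collections import defaultdict

usage_records = [  # one hypothetical record per tablet per month
    {"area": "North", "active": True},  {"area": "North", "active": True},
    {"area": "North", "active": False}, {"area": "South", "active": True},
    {"area": "South", "active": True},  {"area": "South", "active": True},
]

def usage_by_area(records):
    totals, active = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["area"]] += 1
        active[r["area"]] += r["active"]          # True counts as 1
    overall = sum(active.values()) / sum(totals.values())
    per_area = {area: active[area] / totals[area] for area in totals}
    return per_area, overall

if __name__ == "__main__":
    by_area, overall = usage_by_area(usage_records)
    print(f"Overall tablet usage: {overall:.0%}")
    for area, rate in by_area.items():
        print(f"  {area}: {rate:.0%}")
```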

Gardner: Well, that may be a hallmark of business agility, when you can try things out, A/B testing. We’ll try this, we’ll try that, we don’t pay a penalty for doing that. We can simply learn from it and immediately apply our lesson back to the process.

Crawford: It’s all about how we support those areas where we identify that they’re not making the most of the technology they’ve been given. It might be human factors; the staff or even the managers are fearful. Or it might be technical factors; there are inhibitors around mobile network coverage and even broadband coverage in some rural areas. We just follow up on all of that user-experience information we get back and try to proactively improve things.

Gardner: Olaf, when we ask enterprises where they are in their digital transformation, many are saying they’re just at the beginning. For you, who are obviously well into a digital transformation process, what lessons learned could you share; any words of advice for others as they embark on this journey?

Romer: The first digital transformation in the insurance business was in the mid-1990s, when we started to go paperless and work with digital systems. Today, more than 90 percent of our new insurance contracts are completely paperless. In Germany, for example, you can give a digital signature. For the moment, that’s not allowed in Switzerland, but from a technical perspective, we can do it.

My advice would be that digitalization gives you a good opportunity to think about making things simple. We built up great complexity over the years, and now we’re able to bring it down and make things as simple as possible. We created the slogan, “Simply Safe,” to push ourselves to rethink everything we’re doing and make it simple and safe. Again, for insurance, it’s very important that digitalization doesn’t bring more complexity, but reduces it.

Gardner: Craig, digital transformation, lessons learned, what advice can you offer others as they embark?

Document and workflow

Patterson: In digital transformation, I’ll just use documents and workflow as an example. Start with the higher-end items; there’s low-hanging fruit there. I don’t know whether we’ll ever be totally paperless, which would really allow us to go mobile, but at the same time, know what not to scan. Know what to archive and what to just get rid of. And don’t hang on to old technologies for too long. That’s something else that’s starting to happen: the technology lifecycle is getting shorter, and we need to plan our strategies along those lines.

Gardner: Alan, words of advice on those also interested in digital transformation?

Crawford: For us, it started with connecting to our cause. We’ve got social-care staff, and just saying we’re going to do digital transformation isn’t going to really enthuse them. However, if you explain that this is about actually improving the lives of children with technology, then they start to get interested. So, there is a bit about using your cause and relating the change to it.

A lot of our people factors are about how to engage and train. It’s no longer IT saying, “Here’s the solution, and we expect you to do ABC.” It’s working with those social-care workers: here are the options, what will work for you, and how should we approach that? But then it’s never letting up.

Actually, you’ve got to follow through on all this change to get the real benefits out of it. You’ve got to be a bit tenacious with it to really see the benefits in the end.

Gardner: Tie your digital transformation and the organization’s mission together so that there is no daylight between them.

Crawford: We’ve got a project, Digitally Enabling Action for Children, and that was to try to link the two together inextricably.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Citrix.



Infrastructure as destiny — How Purdue builds an IT support fabric for big data-enabled IoT

The next BriefingsDirect Voice of the Customer IT infrastructure thought leadership case study explores how Purdue University has created a strategic IT environment to support dynamic workload requirements.

We’ll now hear how Purdue extended a research and development IT support infrastructure to provide a common and “operational credibility” approach to support myriad types of compute demands by end users and departments.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe how a public university is moving toward IT as a service, please join Gerry McCartney, Chief Information Officer at Purdue University in Indiana. The discussion is moderated by BriefingsDirect’s Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: When you’re in the business of IT infrastructure, you need to predict the future. How do you close the gap between what you think will be demanded of your infrastructure in a few years and what you need to put in place now?

McCartney: A lot of the job that we do is based on trust and people believing that we can be responsive to situations. The most effective way to show that right now is to respond to people’s issues today. If you can do that effectively, then you can present a case that you can take a forward-looking perspective and satisfy what you and they anticipate to be their needs.

McCartney

I don’t think you can make forward-looking statements credibly, especially to a somewhat cynical group of users, if you’re not able to satisfy today’s needs. We refer to that as operational credibility. I don’t like the term operational excellence, but are you credible in what you provide? Do people believe you when you speak?

Gardner: We hear an awful lot about digital disruption in other industries. We see big examples of it in taxi cabs, for example, or hospitality. Is there digital disruption going on at university campuses as well, and how would you describe that?

McCartney: You can think of a university as consisting of three main lines of business, two of which are our core activities: teaching and educating students, and then producing new knowledge, or doing research. The third is the business of running that business, and how you do that. A very large infrastructure has been built up around that third leg, for a variety of reasons.

But if we look at the first two, research in particular, which is where we started, this concept of the third leg of science has been around for some time now. It used to be just experimentation and theory creation. You create a theory, then you do an experiment with some test tubes or something like that, or grow a crop in the field. Then, you refine your theory, and you continue in that kind of dyadic mode of just going backward and forward.

Third leg of science

That was all right until we wanted to crash lorries into walls or to fly a probe into the sun. You don’t get to do that a thousand times, because you can’t afford it, or it’s too big or too small. Simulation has now become what we refer to as the third leg of science.

Slightly more than 35 percent of our actual research now uses high-performance computing (HPC) in some key parts of it to produce results, then shape the theory formulation, and the actual experimentation, which obviously still goes on.

Around teaching, we’ve seen for-profit universities, and we’ve seen massive open online courses (MOOCs) more recently. There’s a strong sense that the current mode of instructional delivery cannot stay the same as it has been for the last hundreds of years and that it’s ripe for reform.

Indeed, my boss at Purdue, Mitch Daniels, would be a clear and vibrant voice in that debate himself. To go back to my earlier comments, our job there is to be able to provide credible alternatives, credible solutions to ideas as they emerge. We still haven’t figured that out collectively as an industry, but that’s something that is in the forefront of a lot of peoples’ minds.

Gardner: Suffice to say that information technology will play a major role in that, whatever it is.

McCartney: It’s hard to imagine a solution that isn’t actually completely dependent upon information technology, for at least its delivery, and maybe for more than that.

Gardner: So, high-performance computing is a bedrock for the simulations needed in modern research. Has that provided you with a good stepping stone toward more cloud-based, distributed computing-based fabric, and ultimately composable infrastructure-based environments?

McCartney: Indeed it has. I can go back maybe seven or eight years at our place, and we had close to 70 data centers on our campus. And by a data center, I mean a room with at least 200-amp supply, and at least 30 tons of additional cooling, not just a room that happens to have some computers in it. I couldn’t possibly count how many of them there are now. Those stand-alone data centers are almost all gone now, thanks to our community cluster program, and the long game is that we probably won’t have much hardware on our campus at some point a few years from now.

Right now, our principal requirement is around research computing, because we have to put the storage close to the compute. That’s just a requirement of the technology.

In fact, many of our administrative services right now are provided by cloud providers. Our users are completely oblivious to that, but we have no on-premises solution at all. We’re not doing travel, expense reimbursement and a variety of back-office things on our campus at all.


That trend is going to continue, and the forcing function there is that I can’t spend enough on security to protect all the assets I have. So, rather than spend even more on security and fail to provide that completely secure environment, it’s better to go to somebody who can provide that environment.

Data-compute link

Gardner: What sort of an infrastructure software environment do you think will give you that opportunity to make the right choices when you decide on-prem versus cloud, even for those intensive workloads that require a tight data and compute link?

McCartney: The worry for any CIO is that the only thing I have that’s mine is my business data. Anything else — web services, network services — I can buy from a vendor. What nobody else can provide me are my actual accounts, if you wish to just choose a business term, but that can be research information, instructional information, or just regular bookkeeping information.

When you come into the room with a new solution, you’re immediately looking at the exit door. In other words, when I have to leave, how easy, difficult, or expensive is it going to be to extract my information back from that solution?

That drives a huge part of any consideration, whether it’s cloud or on-prem or whether it’s proprietary or open code solution. When this product dies, the company goes bust, we lose interest in it, or whatever — how easy, expensive, difficult is it for me to extract my business data back from that environment, because I am going to need to do that?

Gardner: What, at this juncture, meets that requirement in your mind? We’ve heard a lot recently about container technology, standards for open-source platforms, industry accepted norms for cloud platforms. What do you think reduces your risk at this point?

McCartney: I don’t think it’s there yet for me. I’m happy to have, relatively speaking, small lines of business. Also, you’re dependent then on your network availability and volume. So, I’m quite happy there, because I wasn’t the first, and because that’s not an important narrative for us as an institution.

I’m quite happy for everybody else to knock the bumps out of the road for me, and I’ll be happy to drive along it when it’s a six-lane highway. Right now it’s barely paved, and I’ll allow other brave souls to go there ahead of me.

Gardner: You mentioned early on in our discussion the word “cynical.” Tell me a little bit about the unique requirements in a university environment where you need to provide a common, centrally managed approach to IT for cost and security and manageability, but also see to the unique concerns and requirements of individual stakeholders?

McCartney: All universities are, as they should be, full of self-consciously very smart people who are all convinced they could do a job, any particular job, better than the incumbent is doing it. Having said that, the vast bulk of them have very little interest in anything to do with infrastructure.

The way this plays out is that the central IT group provides the core base services: the network, the wireless services, base storage, base compute, things like that. As you move to the edge, the users themselves provide the things that make a difference at the edge.

Providing the service

In other words, if you have a unique electrical device that you want to plug in to a socket in the wall because you are in paleontology, cell biology, or organic chemistry, that’s fine. You don’t need your own electricity generating plants to do that. I can provide you with the electricity. You just need the cute device and you can do your business, and everybody is happy.

Whatever the IT equivalent to that is, I want to be the energy supplier. Then, you have your device at the edge that makes a difference for you. You don’t have to worry about the electricity working; it’s just there. I go back to that phrase “operational credibility.” Are we genuinely surprised when the service doesn’t work? That’s what credibility means.

Gardner: So, to me, that really starts to mean IT as a service, not just electricity or compute or storage. It’s really the function of IT. Is that in line with your thinking, and how would you best describe IT as a service?

McCartney: I think that’s exactly right, Dana. There are two components to this. There’s an operational component, which is, are you a credible provider of whatever the institution decides the services are that it needs, lighting, air-conditioning or the IT equivalence of that? They just work. They work at reasonable cost; it’s all good. That’s the operational component.

The difference with IT, as opposed to other infrastructure components, is that IT has itself the capability to transform entire processes. That’s not true of other infrastructure things. I can take an IT process and completely reengineer something that’s important to me, using advantages that the technology gives me.


For example, I might be concerned about student performance in particular programs. I can use geo-location data about their movement. I can use network activity. I can use a variety of other resources available to me to help in the guidance of those students on what’s good behavior and what’s helpful behavior to an outcome that they want. You can’t do that with an air-conditioning system.

IT has that capability to reinvent itself and reinvent entire processes. You mentioned some of them, the way that things like Uber have entirely disrupted the taxi industry. I’d say the same thing here.

There’s one part of the CIO’s job that’s operational; does everything work? The second part is, if we’re in transition period to a new business model, how involved are the IT leaders in your group in that discussion? It’s not just can we do this with IT or not, but it’s more can a CIO and the CIO’s staff bring an imagination to the conversation, that is a different perspective than other voices in the organization? That’s true of any industry or line of business.

Are you merely there as a handmaiden waiting to be told what to do, or are you an active partner in the conversation? Are you a business partner? I know that’s a phrase people like to use. There’s a kind of a great divide there.

Gardner: I can see where IT is a disruptor — and it’s also a solution to the disruptor, but that solution might further disrupt things. So, it’s really an interesting period. Tell me a little bit more about this concept of student retention using new technologies — geolocation for example — as well as big data which has become more available at much lower cost. You might even think of analytics as a service as another component of IT as a service.

How impactful will that be on how you can manage your campus, not only for student retention, but perhaps for other aspects of a smarter intelligent campus opportunity? [See related post, Nottingham Trent University Elevates Big Data’s Role to Improving Student Retention in Higher Education.]

Personalized attention

McCartney: One of the great attractions of small educational institutions is that you get a lot of personalized attention. The constraint of a small institution is that you have very little choice. There’s a small number of faculty, and they simply can’t offer the options and different concentrations that you get in a large institution.

In a large institution, you have the exact opposite problem. You have many, many choices, perhaps even too many subjects that, as a 19-year-old, you’ve never even heard of. Perhaps you get less individualized attention and you fill that gap by taking advice from students who went to your high school a year before, who are people in your residence hall, or people you bump into on the street. The knowledge that you acquire there is accidental, opportunistic, and not structured in any way around you as an individual, but it’s better than nothing.

There are advisors, of course, and there are people, but you don’t know these individuals. You have to go and form relationships with them and they have to understand you and you have to understand them.

A big-data opportunity here is to be able to look at the students at some level of individuality. “Look, this is your past, this is what you have done, this is what you think, and this is the behavior that we are not sure you’re engaging in right now. Have you thought about this path, have you thought about this kind of behavior for yourself?”

A well-established principle in student services is that the best indicator of student success is how engaged they are in the institution. There are many surrogate measures of that, like whether they participate in clubs. Do they go home every weekend, indicating they are not really engaged, that they haven’t made that transition?

Independent of your academic ability, your SAT scores, and your GPA that you got in high school, for students that engage, that behavior is highly correlated with success and good outcomes, the outcomes everybody wants.

As an institution, how do you advise or counsel them? They’ll say perhaps there’s nothing here they’re interested in, and that can be a problem with a small institution. It’s very intimate. Everybody says, “Dana, we can see you’re not having a great time. Would you like to join the chess club or the drafts club?” And you say, “Well, I was looking for the Legion of Doom Club, and you don’t seem to have one here.”

Well, if you go to a large institution, they probably have two of those things, but how would you find them, and how would you even know to look? How would you discover new things that you didn’t even know you liked, because the high school you went to didn’t teach applied engineering, or a whole pile of other things, for that matter?

Gardner: It’s interesting when you look at it that way. The student retention equation is, in a business sense, the equivalent of user experience, personalization, engagement, share of wallet, those sorts of metrics.

We have the opportunity now, probably for the first time, to use big data, the Internet of Things (IoT), and analytics to measure, predict, and intercede at a behavioral level. So in this case, helping somebody become a productive member of society at a capacity they might otherwise miss, when you only have one or two chances at it, seems like a rather monumental opportunity.

Effective path

McCartney: You’re exactly right, Dana. I’m not sure I like the equivalence with a customer, but I get the point that you’re making there. What you’re trying to do is to genuinely help students discover an effective path for themselves and learn that. You can learn it randomly, and that’s nice. We don’t want to create this kind of railroad track. Well, you’re here; you’ve got to end up over there. That’s not helpful either.

My own experience, and I don’t know about other people listening to this, is that you have remarkably little information when you’re making these choices at 19 and 20. Usually, if you were getting direction, it was from somebody who had a plan for you that was more based on their experience of life, some 20 or 30 years previously than on your experience of life.


So where big data can be a very effective play here is to say, “Look, here are people who look like you, and here are the choices they’ve made. You might find some of these choices interesting. If you do, then here’s how you’d go about exploring that.”
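One simple way to picture that, purely as an illustration, is a nearest-neighbor style lookup: compare a student's profile with past students on a few features and surface the programs the most similar students chose. The features, records, and similarity choice below are assumptions made up for the example, not Purdue's actual model or data.

```python
# Toy "students who look like you" recommendation via cosine similarity.
from math import sqrt

past_students = [  # hypothetical feature vectors and the programs chosen
    {"features": [0.9, 0.2, 0.7], "program": "applied engineering"},
    {"features": [0.8, 0.3, 0.6], "program": "applied engineering"},
    {"features": [0.2, 0.9, 0.4], "program": "labor economics"},
    {"features": [0.3, 0.8, 0.5], "program": "philosophy"},
]

def similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def suggest_programs(student_features, k=2):
    """Return programs chosen by the k most similar past students."""
    ranked = sorted(past_students,
                    key=lambda s: similarity(student_features, s["features"]),
                    reverse=True)
    return [s["program"] for s in ranked[:k]]

if __name__ == "__main__":
    print(suggest_programs([0.85, 0.25, 0.65]))
```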

As you rightly say, and implicitly suggested, there is a concern with the high costs, especially of residential education, right now. The most wasteful expenditure there is where you spend a year or two only to find out you should never have been in this program; you have no love for it and no affinity for it.

The sooner you can find that out for yourself and make a conscious choice the better. We see big data having a very active role in that because one of the great advantages of being in a large institution is that we have tens of thousands of students over many years. We know what those outcomes look like, and we know different choices that different people have made. Yes, you can be the first person to make a brand new choice, and good for you if you are.

Gardner: Well it’s an interesting way of looking at big data that has a major societal benefit in the offing. It also provides predictability and tools for people in ways they hadn’t had before. So, I think it’s very commendable.

Before we sign off, what comes next after high-performance computing (HPC), fabric cloud, and IT as a service? Is there another chapter on this journey that perhaps you have a bead on that we’re not aware of?

McCartney: Oh my goodness, yes. We have an event, which I started three years ago, called “Dawn or Doom.” The premise is that technology is a forcing function, if it is; we’re not even going to assert that definitely. Are we reaching a point of a new nirvana, a new human paradise where we’ve resolved all the major social and health problems, or have we created some new seventh circle of hell that’s actually an unmitigated disaster for almost everybody, if not everybody? Is this the end of life as we know it? Do we create robots that are superior to us in every way, so that we become just some intermediate form of life that has reached the end of its cycle?

This is an annual event that’s free and open. Anybody who wants to come is very welcome to attend. You can Google “Dawn or Doom Purdue.” We look at it from all different perspectives. So, we have obviously engineers and computer scientists, but we have psychologists, we have labor economists. What about the future of work? If nobody has a job, is that a blessing or a curse?

Psychologists, philosophers, what does it mean, what does artificial intelligence mean, what does a self-conscious machine mean? Currently, of course, we have things like food security we worry about. And the Zika virus — are we spawning a whole new set of viruses we have no cure for? Have we reached the end of the effectiveness of antibiotics or not?

These are all incredibly interesting questions I would think any intelligent person would want to at least probe around, and we’ve had some significant success with that.

Next event

Gardner: When is the next Dawn or Doom event, and where will it be?

McCartney: It will be in West Lafayette, Indiana, on October 3 and 4. We have a number of external, high-profile keynote speakers, and then we have a passel of Purdue faculty. So, you will find something to entertain even the most arcane of interests. [For more on Dawn or Doom, see the book, Dawn or Doom: The Risks and Rewards of Emerging Technologies.]

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.



How UPS automates supply chain management and gains greater insight for procurement efficiency

The next BriefingsDirect business innovation for procurement case study examines how UPS modernizes and streamlines its procure-to-pay processes.

Learn how UPS — across billions of dollars of supplier spend per year — automates supply-chain management and leverages new technologies to provide greater insight into procurement networks. This business process innovation exchange comes to you in conjunction with the Tradeshift Innovation Day held in New York on June 22, 2016.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. 

To explore how procurement has become strategic for UPS, BriefingsDirect sat down with Jamie Dawson, Vice-President of UPS’s Global Business Services Procurement Strategy in Atlanta. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What are the major trends that you’re seeing in procurement, and how are you changing your strategy to adjust?

Dawson: We’re seeing a lot of evolution in the marketplace, in terms of both technology and new opportunities in ways to procure goods, and that really is true around the globe. We’re adjusting our strategy and also challenging some of our business partners to come along with us.

We’re a $60 billion company. Last year, our total expenses were somewhere in the $50-billion range, lots of goods and services flowing around the globe.

Gardner: And so, any new efficiency or spend-management benefit you can find turns into some significant savings.

Dawson: Absolutely.

Gardner: Now that you’re looking for new strategies and new solutions, what is it in procurement that’s of most interest to you and how are you using technology in ways you didn’t before?

Collaboration and partnerships

Dawson: One of the new ways is a combination of partnerships, both with third parties and with our own internal business partners. We’re collaborating with other functions; procurement is not something we’re doing to them. We’re working with them to understand what their needs are, and working with their suppliers as well.

Dawson

Gardner: We’re hearing some very interesting things these days about using machine learning and artificial intelligence, combining that with human agents who are specialized. It sounds like, in some ways, external procurement services can do the job better than anyone. Is that something that you’re open to? Is procurement as a service something you’re looking at? [See related post, How new modes of buying and evaluating goods and services disrupts business procurement — for the better.]

Dawson: Procurement-as-a-service has a certain niche play. There will always be basic buy-and-sell items, even as individuals. There are some things you don’t research, but you just go out and buy. There are other things for which you do a lot of research and you look into different solutions.

There are different things that will cause you to research more. Maybe it’s a competitive advantage; maybe you’re looking for an opportunity in a new space or a new corner of the globe. So, you’ll do a lot more research, and your solutions need to be scalable. If you create something and start in Europe, maybe you’ll also want to use it in Asia. If you start in the US, maybe you want to use it elsewhere.

Gardner: It sure sounds like, during this period of experimentation, the boundary between the things you would buy by rote and the things you would buy with a lot of expertise or research is shifting or changing. Are you experimenting as an organization, and what is interesting to you as you look at new opportunities from the people in the procurement network space?

Dawson: There will always be complex areas that require solution orientation more than just price. They need a deep understanding of industry, knowledge, and partnership. There are a lot of other areas where the opportunities are expanding every day. [See related post, ChainLink analyst on how cloud-enabled supply chain networks drive companies to better manage finances, procurement.]

Gardner: As you think about what you’ve done and been able to accomplish, do you have any advice for other organizations that are also starting to think about modernizing and strategizing, rather than just doing it in the traditional old way? What would you tell them?

Dawson: Two things. One would be within the procurement organizations to be open to new ideas. And second, get the rest of the organization behind you, because you’re going to need their support.

Gardner: It seems that procurement as a function is just far more strategic than it used to be. Not only are you able to get more goods and services, but you can save significant amounts of money. Do you feel that your profile as an organization within UPS is rising or expanding in terms of the role you play in the larger organization? [See related post, CPO expert Joanna Martinez extols the virtues of redesigning procurement for strategic business agility.]

Don’t have to sell

Dawson: I’m certainly aware that the knowledge of the capabilities and the demonstrated successes are now being recognized throughout the organization. And it becomes self-feeding. You actually get on a roll and can further expand the capabilities once that knowledge is out there; you don’t have to sell.

Gardner: Last question, looking to the future, on a vision level, what’s really exciting to you? What are you thinking that might be more important to you in how you do business two or three years from now? It could be technology, suppliers, ecosystems, cloud enabled intelligence, that sort of thing.

Dawson: It’s a very interesting question, because it’s almost the same answer: your greatest fear is also the greatest benefit. I listened to what we just heard on the Tradeshift Go tool, and it’s crazy how exciting this is. You heard all the questions in the room about how to adapt that to what you already have today. The world still exists as it exists today.

So, there’s this huge transition period where we were bolting on these fantastic great ideas to our existing infrastructure. That transition into what’s new and really embracing it is the most exciting of all.

Gardner: Disruption can be good and disruption can be bad.

Dawson: It will be a challenging journey.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Tradeshift.



How the Citrix Technology Professionals Program produces user experience benefits from greater ecosystem collaboration

The next BriefingsDirect thought leadership panel discussion focuses on how expert user communities around technologies and solutions create powerful digital business improvements.

As an example, we will explore how the Citrix Technology Professionals Program, or CTPs as they are referred to, gives participants a larger say in essential strategy initiatives such as enabling mobile work styles.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about the CTP program and how an ongoing dialogue between vendors and experts provides the best end-user experiences, we’re joined by Douglas Brown, Founder of DABCC.com in Sarasota, Florida; Rick Dehlinger, an Independent Technologist and Business Visionary in Sacramento, California; Jo Harder, Cloud Architect at D+H and Industry Analyst at Virtualization Practice in Fort Myers, Florida; and Steve Greenberg, President of Thin Client Computing in Scottsdale, Arizona. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: We hear so much nowadays about user experience. You might say that you, as a community-based organization, are the original user experience provider. What is the CTP program as a user group and how ultimately does your user experience translate into improvements for the Citrix community and ecosystem?

Brown: I’ve been a CTP since the inception of the CTP Program, and I’ve been part of the Citrix Community since 1997.

Brown

What’s neat about the CTP Program and the Citrix Community in general is that we’re able to bring a bunch of great, talented people together, and then in return, take that combined experience and knowledge and share that with other people.

What was interesting and what got me into the community way back when, was the fact that there was just no information. You were just really out on your own trying to solve problems. And when we were able to then put that in the community, we all exponentially got better.

What I’ve found through the Citrix Community in general, the Citrix Users Group that Citrix has recently started, and the CTP Program is that you’re always better together. That’s the biggest takeaway for me from not just the 10 years of CTP, but 15 or 16 years of being in the greater Citrix Community itself.

Gardner: Steve, how well and effective does this advocacy role work? How much traction are you getting?

Greenberg: It’s amazing how well it works. Doug referred to the old days. We had a 1997-to-2007 era, where you didn’t have the feedback loop, and products evolved slowly. We’d see a new product release and ask why they did that. So this passionate group, because of the Internet, because we’re all kind of little freaks in our little neighborhoods somewhere around the world, all found each other and came together with such a passion.

Greenberg

We haven’t calculated it, but it’s in excess of 1,000 years of hands-on experience among this group of 50 or so people. It works, and Citrix has come to value it. Other companies are following the model and developing community programs. It’s really invigorating to learn something from the true end user, the customer, and bring it back to headquarters and see the products evolve and change.

Brown: It’s really a 360-degree type of program. It’s not just for us; it also benefits Citrix, and then, of course, everyone: the customers, the engineers, what have you.

Gardner: As was mentioned, we’re in this era of social media, and people can be their own publisher and they can be an earphone and a megaphone at the same time. So Rick, do you feel like you’re representing a large group, and how do they communicate to you what they’re feeling?

Much broader audience

Dehlinger: I do feel like I represent a pretty large group, especially when you start wandering the halls of Citrix Synergy. It’s like a college or high school reunion that happens every year. I definitely feel like we represent a much broader audience.

We (the members of the CTP program) also have people who represent perspectives from various locations across the world, different industries, industry functions, different customer bases — even different seats in the ecosystem — the partner community, end user, customer, and other technology-provider companies.

Dehlinger

In terms of communication, some of the tools have evolved over the last 10 years. Steve made a good point. I hadn’t really thought about the fact that we have two different eras. The era of the last 10 years has really been one of greatly increased communication and transparency, and that’s one of the things that the CTP program is fantastic about.

[Interesting editorial note: Shortly after the inception of the CTP Program in 2006, a couple of the founding CTPs, Brian Madden and Rick Dehlinger, wrote blog articles essentially calling Citrix out for being closed off and not showing any thought leadership in the industry.

Then-Citrix CEO Mark Templeton got the message loud and clear, and reversed the policy against Citrixites blogging. This was effectively the turning point between the eras Steve Greenberg mentioned, and the first big impact the CTPs had on Citrix and the industry.]

Learn How the Citrix Technology Professionals Program Helps
Shape the Future of Cloud Computing  

Steve mentioned that a lot of other vendors are starting to use the CTP Program as a model for building their own community programs. This group of people is very passionate about Citrix technologies and passionate about touching the lives of others. The program combines those two passions and puts us behind closed doors with the opportunity to have very real conversations with the leaders, developers, and product managers at Citrix.

We have effected some very substantial, positive change at Citrix. We’ve helped them stop going down some roads that would have been disastrous, and recover from decisions that were turning out to be disastrous or dead ends, and the products ultimately improved.

Greenberg: And to Doug’s 360-degree comment, we continue to be inspired by what they bring out and put in front of us as a possible vision; it’s incredible. Just so you understand, we’re usually locked in a room for two full days, approximately 10 to 12 hours, a couple of times a year, and it gets deep. It’s like a family having an inside discussion that gets real hot, but it’s two-way.

Perhaps at first it was us saying, “You have got to fix this stuff,” but now it’s inspiring to see what comes out, when they come to the community and say, “We’re thinking about this; how would that work?” It’s really, really cool.

Brown: I like the fact that Steve mentioned it’s really two different eras. Prior to the CTP Program, and I was around when they started it, we really had to push for something like this at Citrix. A typical corporation back then was not about outside feedback per se. They did not blog; there was no social media. It was a very controlled message.

Nowadays, obviously they still need to control the message, but it’s just wide open. It’s a wide-open world out there today.

Interactive, wide-ranging

Gardner: Jo, you’re like a focus group in a sense, but interactive and wide-ranging in terms of your impact and the information you get from the field. So, as a focus group, what did you accomplish recently at Citrix Synergy 2016?

Harder: Let me step back and say that we’re under NDA with Citrix. These closed-door discussions that Steve mentioned are very private discussions. The product managers go into what’s happening, what they’re thinking about for future products, and that’s really the basis for those discussions.

Harder

I never really thought of us as a focus group, but we are. It’s really great that we can give feedback to each other. Because we have such varied experiences and expertise, there are some products that I know really well that the person sitting next to me might touch once a year. So we have complete variety in the group. It’s really great to be able to have those discussions as a focus group, if you will, and to be able to provide that feedback to the folks at Citrix, and really to each other as well, because we do learn a lot from each other.

Gardner: Citrix has so many different product lines, some inherited through acquisition and some built organically, that no two users consume them in the same way. What are you seeing in terms of adoption? What would you say is the most interesting part of Citrix’s solutions in this particular day and age?

Dehlinger: In terms of community representation in our little focus group, I tend to be one of the ones who advocates very heavily for the cloud, and for increasing the pace of evolution, helping drag the traditional Citrix enterprise customer base further into the new world that we live in. For me, the most exciting stuff has definitely got to be the cloud.

The evolution of Citrix’s Cloud Services, now called Citrix Cloud, and all the stuff underneath it, is fantastic. It’s monumental, not just for the consumer base but also for Citrix, because it gets them into the world of rapid prototyping and rapid evolution, of consistent, evergreen products and services, and it also starts to put them into a different world of cloud-based consumption and pricing.

Every day, every week, every month, every year, you have to continue to prove your value, improve your value, and provide a high level of service quality. If you don’t, you’re cut off; the customer has the opportunity to walk away.

One of the things that’s most exciting about that for me is the opportunity for Citrix to evolve into the cloud-first world alongside Microsoft. If you look at the traditional enterprise technology vendors out there, they’ve been selling into the enterprise on a capital-expenditure model.

Customers spend all these big bucks up front, and the vendors’ entire ecosystems, their sales teams, even their product-development cycles, are based on those big buys and long deployment processes. So much of the company revolves around up-front capital expenditure and long deployment cycles, and the entire ecosystem gets tied to that.

Then you look at the polar opposite end of that, which is the cloud, with consumption-based pricing and the attributes I mentioned a little earlier.

Adoption patterns

Gardner: So the adoption patterns could be quite interesting. We could be seeing all sorts of new models popping up, and that could be interesting for companies as well as for the end-user organizations.

Dehlinger: In my mind, it increases the transparency on both sides. Citrix knows and understands who is using what, and also what they are not using. The customer has an opportunity to vote with their dollars, not just once up front when they’re seeing all the stars of the sales pitch, but on a monthly or yearly basis.

That’s actually the most exciting part to me, because Microsoft has made that pivot now with Office 365, Azure, and all of that product family. They’ve brought their ecosystem around, and they’re showing the world that it’s possible to evolve from being a traditional enterprise software/technology vendor to being a cloud service provider.

So, it’s exciting for me. What I see as the future of Citrix and of the community is Citrix getting over that hump themselves and really getting into it. They have reinvented themselves many times over the years.

Gardner: Steve, thin-client computing has always been an interesting solution, but tying that to any device, any cloud: what do you see as some of the most interesting developments?

Greenberg: To me, it’s that push forward, and it’s the new CEO, Kirill Tatarinov, making a strong statement that we’re going to the cloud and, as Rick says, taking it forward. But the most exciting thing for me, because day in and day out I architect and implement designs, is to take this suite and fit it to the organization. Every organization is different, and the best part of my job is going in and learning a new organization, what they do and how they do it. Inevitably, something Citrix is doing makes that better.

Now, as Rick said, we just have more options. If the cloud is the best delivery model for an organization, perhaps because they’re distributed around the world or for some other reason, now they can do it. They have Citrix behind them casting the vision.


So it’s the flexibility; it’s the power and excitement you get from moving at the speed of the business. It’s not IT saying no, not IT saying, “Well, I can’t do that new product line because our system is blah, blah, blah.” If we need to move quickly, throw it in Azure. Let’s get on to that new offering.

Harder: Say “yes.”

Gardner: Jo, virtualization has never been as prominent as it is now. What do you see from the virtualization perspective with the new products and the new embrace of virtualization at multiple abstraction layers?

Tying in security

Harder: I’m looking at it from the banking sector, because that’s what I live and breathe. I’m looking at it from security, compliance, everything that comes along with the finance industry. I look at that probably a little bit more cautiously than most, but what I find pretty interesting is that Citrix is really tying in security end-to-end.

Some of the sessions here at Synergy have talked about the whole security piece. You want to be progressive, but you have to do it very securely. That’s one of the pieces that I’m really embracing from a virtualization standpoint.

From the standpoint of finance, there should be no data on the workstation. If somebody were to walk into a bank and steal that client device, they should not be able to walk off with any Social Security numbers, any non-public personal information (NPI), anything of that sort. That’s what excites me about virtualization: the way Citrix ties all the moving parts together.

In the future, the next step for the banks is getting into wireless, getting into mobility. Citrix is very well-poised for that. So, the future is bright.

Gardner: So, security was the original big use case for VDI: nothing on the client. But now clients are everywhere. So it’s really, “How do we get the data from the edge and to the edge securely?”

Douglas, what are some of the key points from your perspective in terms of the Citrix product line and how that impacts users that you represent?

Brown: That’s a good question. I’m a XenApp baby. I see the cloud as the real, true information highway. It’s the enabler that allows us to bring things to market more quickly. XenApp is the ultimate tool to then give access to those applications anywhere, any time.

I don’t care if it’s 2016, with all the stuff that we do today, or if it’s 1999, at the end of the day, I have never met an end user that comes into the company and says, “Gee, I can’t wait to use Windows 10,” or “Gee, I can’t wait to use that new Cisco Core Router they just bought.” They don’t come into work and say, “Oh no, I have to do a spreadsheet today.” They don’t even talk about Excel.

With all these different technologies we’re bringing in, be it the cloud, or mobility, or whatever, and back to the user experience piece, Citrix is able to give the end user a better, faster time to market. At the end of the day, they’re able to work better from any place, at any time.

I’ve been living mostly in Sarasota, but I also commute to Berlin, Germany. It’s sort of an interesting commute, but it doesn’t matter where I live, and that’s the same story we’ve been telling for 15 years.

It’s not about a new story; it’s just about bringing in more components to fulfill that destiny of a better user experience. What’s IT there for? It’s to enable the users to do their jobs better, and ultimately, that’s what Citrix is about. Everything else is just fluff. Everything else is just the machinery.

Network intelligence

Gardner: Rick, when we think about changes in Citrix over the past couple of years — and there have been a lot of them — one of the things that strikes me is that they seem much more interested in strutting their stuff when it comes to their network intelligence capabilities.

There’s a lot more discussion of NetScaler and how it integrates with mobility, security, big-data analytics, and cloud. Do you agree that NetScaler and the intelligent-network components are more prominent, and how does that play into the future?

Dehlinger: NetScaler was, by anybody’s measure, one of the best acquisitions Citrix ever made. They got some fantastic technology and brilliant talent. Some of the things that we’ve been able to do with NetScaler in our tool bag, as we’re out solving problems and helping customers take things to the next level, is just mind-boggling.

I’m thrilled at the change. It seems like they’ve finally started to figure out a better way to communicate both what NetScaler is and its role in this whole game. You asked me about the Microsoft-Citrix relationship a bit earlier. Some of the stuff that Citrix is doing now in that partnership to start incorporating and leveraging NetScaler, and its unique layer of visibility between the user and the applications, will enable some really amazing new capabilities.

I think it’s fantastic that they finally found the language. NetScaler is starting to get its feet underneath it, although you could argue it already has: it’s been a billion-dollar-plus business for Citrix for a couple of years now.

Gardner: Jo, how about you, in terms of security and the banking sector in particular? Are intelligent network services something really impressive and important, or what?

Harder: Just to expand on what Rick said, I think what Citrix is doing with NetScaler is great. Some days I feel like I don’t fully understand it, and I’m immersed in these technologies, but then you learn something else that NetScaler can do for you. There is more, there is more, there is more. It’s in there, and it’s a matter of finding out exactly how best to use it, and then going ahead and using the products. With NetScaler, I totally agree with Rick; the sky is the limit.

Dehlinger: Well, NetScaler used to be the realm of the packet-trace junkies. Load balancing is the easiest thing people can use to describe what NetScaler does, but that whole world was fraught with massive acronyms, crazy technology, terminology, and standards that, for the normal human being or the business person in particular, were just mind-boggling and baffling.

It’s great that Citrix is finally finding some language to demystify a little bit of that, and to show that underneath all the mysticism, and the support for all these crazy new fancy TLAs and acronyms, there is some really amazing, powerful business value just waiting to be unlocked and leveraged.

Gardner: Steve, mobile work styles as opposed to mobility or device or bring your own device (BYOD) — how far do you feel that your community contacts have gotten in that direction of a mobility style change rather than simply doing something with a smaller device in more places?

Transforming organizations

Greenberg: That’s a great question, because I think this particular group has been at the core of this for some time, and we have taken some very notable large organizations and completely transformed them.

People work from home. People work on a multitude of devices. I can be sitting at the desktop in the office, grab a laptop and go jump in a cab, take my phone, and there is that seamless experience. We really are there. At this point, it’s just a matter of getting it more widely infiltrated, getting people aware of what they can do.

To this day, although it seems old to us, I still go into new client sites and opportunities and say, “You could do this,” and they say, “Really? I didn’t know I could do that.” It’s there, but now the society is catching up, if that makes sense.

Gardner: It also seems that some of the file-share demonstrations and announcements show the benefit of the whole being greater than the sum of the parts, when you can integrate with the cloud and with devices. Any thoughts about the power of an integrated file share, rather than just the plain-vanilla, one-size-fits-all type of cloud-based file share?

Greenberg: That’s the final layer that makes this mobile work style a reality. Before, if you could remote in, in the XenApp style that Doug was referring to, you could get your job done. But now that you can transmit data securely, when it hits your phone, you’re working on it natively.

I go into the subway and the signal drops. Well, that file is there and I can edit it, sign it, get my signal back, and go. It has taken that virtualization mobility to a level now where it can travel and it can be seamless.

Gardner: And that’s an intelligent container. So, if your requirements around privacy or security mean that you have to have control over what that session is and does, you can get that.

Douglas, how important is that intelligent container when put in the context of an intelligent network?

Brown: Extremely important. It’s important from every aspect of the business. Nowadays, we’re able to do things we have never been able to do in the past, at the level they’re at now.

It can’t be overstated how important those components are. It comes down to maturity. The vision has been there, and the technology has been coming around. Now that technologies such as these have matured, we’re able to achieve all of our goals, from the business through to the end users.

New capability

Greenberg: At Synergy 2016, Citrix demonstrated a new capability that wasn’t there before. We’re all familiar with the Dropbox model, where I can send a file, but once you send it, it’s out there in the wild. What they showed today was sending a file and then changing its status. Even though that person had received the file and looked at it, when the status changed, they could no longer see it. That’s the home run. That’s the piece that was not part of this capability before.

Harder: I tweeted this morning that this new capability really propels Citrix ShareFile into being the file-sharing solution for business. There are a lot of other solutions out there, but they’re really not suitable for business; they don’t provide that level of security, or the signature-signing capability that this enables. Think about the security impacts of that, and the legalities. They have it covered. There’s a lot more coming. Once some of the states start to allow a digital signature to be incorporated as a notarized signature, wow.
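
[Editorial note: The revocation behavior Steve and Jo describe is generally possible because the shared file is brokered by a service that re-checks the share’s status on every access, rather than handing the recipient a permanent copy. The Python sketch below is a minimal, hypothetical illustration of that pattern; the names (ShareStore, create_share, revoke, fetch) are invented for illustration and are not the Citrix ShareFile API.]

```python
# Hypothetical sketch (not the ShareFile API): status-checked file sharing.
# The key idea: every download goes through a broker that re-checks the
# share's status, so the owner can revoke access even after the recipient
# has already opened the file once.

import secrets


class ShareStore:
    """In-memory registry of shares; each token maps to a file path and a status."""

    def __init__(self):
        self._shares = {}

    def create_share(self, path):
        token = secrets.token_urlsafe(16)           # unguessable share link
        self._shares[token] = {"path": path, "status": "active"}
        return token

    def revoke(self, token):
        self._shares[token]["status"] = "revoked"   # the owner flips the status

    def fetch(self, token):
        share = self._shares.get(token)
        if share is None or share["status"] != "active":
            raise PermissionError("share revoked or unknown")
        with open(share["path"], "rb") as f:        # served only while active
            return f.read()


# Usage: the recipient can fetch the file until the sender revokes the share.
store = ShareStore()
token = store.create_share("quarterly_report.pdf")
# store.fetch(token)   # succeeds while the share status is "active"
store.revoke(token)
# store.fetch(token)   # now raises PermissionError, even after prior access
```

Contrast this with the Dropbox-style model mentioned above: once the bytes leave your hands as a plain copy, there is no status left to check.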

Gardner: Many business processes really do get that mobile style of work as a result, and rather than just repaving cow paths, you’re really doing something quite new and different.

Before we sign off, I would like to allow our listeners and readers to get more information on the Citrix Technology Professional Program. If they’re interested in learning more, maybe taking some role themselves, where should they go?

Dehlinger: Definitely start with the CTP page on the Citrix website. That’s a great place to find out more about this group and what they do. But also look at the Citrix User Group Communities out there; there are a lot of fantastic people there. We (CTPs) are blessed to have the opportunity to represent a big base, but in a lot of localities around the world, the Citrix User Group Communities have been doing some fantastic things and making a difference locally.

Gardner: Sort of a federation of groups around the world.

Dehlinger: Absolutely.

Greenberg: I would add, blog, tweet, turn out for user groups, come out to Synergy, come out to Summit. If you’re one of the reseller partners, make yourself known.

We’re a community of almost-crazy enthusiasts. We have a ridiculous level of interest and passion. We have a tendency to find each other, and we’re always amazed to see new people come from a place, a country, or a business we’ve never heard of with new solutions.


A great event happening today is Geek Speak, tonight. We’ve done a GeekOvation program, where people submit their projects and their work, then come up, get recognized for it, and have a little contest. There are endless possibilities. Just get out there and start communicating.

Dehlinger: Participate!

Harder: And have fun.

Brown: In a couple of weeks, I’m going to Norway with Rick for one of the best and oldest Citrix User Groups in the world. But that kind of advocacy is only half of it; there are programs and other avenues for people looking to get into the CTP Program, or just to share knowledge in general.

Start up a blog, have some fun, share knowledge. I’ve always said, knowledge is not power; power is in dispersing that knowledge.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Citrix.
