Using AI to solve data and IT complexity -- and thereby better enable AI

Learn how AI will help conquer complexity and allow higher-level benefits to be abstracted from all sorts of data for better analysis.

How real-time data streaming and integration set the stage for AI-driven DataOps

A discussion of the latest strategies for uniting and governing data to enable rapid and actionable analysis in a multi-cloud world.

How new tools help any business build ethical and sustainable supply chains

The next BriefingsDirect digital business innovations discussion explores new ways that companies gain improved visibility, analytics, and predictive responses to better manage supply-chain risk-and-reward sustainability factors.

We’ll examine new tools and methods that can be combined to ease the assessment and remediation of hundreds of supply-chain risks -- from the use of illegal and unethical labor practices to hidden environmental malpractices.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

Here to explore more about the exploding sophistication in the ability to gain insights into supply-chain risks and provide rapid remediation are our panelists: Tony Harris, Global Vice President and General Manager of Supplier Management Solutions at SAP Ariba; Erin McVeigh, Head of Products and Data Services at Verisk Maplecroft; and Emily Rakowski, Chief Marketing Officer at EcoVadis. The discussion was moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Tony, I heard somebody say recently there’s never been a better time to gather information and to assert governance across supply chains. Why is that the case? Why is this an opportune time to be attacking risk in supply chains?

Harris: Several factors have culminated in a very short time around the need for organizations to have better governance and insight into their supply chains.

First, there is legislation such as the UK’s Modern Slavery Act in 2015 and variations of this across the world. This is forcing companies to make declarations that they are working to eradicate forced labor from their supply chains. Of course, they can state that they are not taking any action, but if you can imagine the impacts that such a statement would have on the reputation of the company, it’s not going to be very good. 

Next, there has been a real step change in the way the public now considers and evaluates the companies whose goods and services they are buying. People inherently want to do good in the world, and they want to buy products and services from companies who can demonstrate, in full transparency, that they are also making a positive contribution to society -- and not just generating dividends and capital growth for shareholders. 

Finally, there’s also been a step change by many innovative companies that have realized the real value of fully embracing an environmental, social, and governance (ESG) agenda. There’s clear evidence that now shows that companies with a solid ESG policy are more valuable. They sell more. The company’s valuation is higher. They attract and retain more top talent -- particularly Millennials and Generation Z -- and they are more likely to get better investment rates as well. 

Gardner: The impetus is clearly there for ethical examination of how you do business, and to let your customers know that. But what about the technologies and methods that better accomplish this? Is there not, hand in hand, an opportunity to dig deeper and see deeper than you ever could before?

Better business decisions with AI

Harris: Yes, we have seen a big increase in the number of data and content companies that now provide insights into the different risk types that organizations face.

We have companies like EcoVadis that have built score cards on various corporate social responsibility (CSR) metrics, and Verisk Maplecroft’s indices across the whole range of ESG criteria. We have financial risk ratings, we have cyber risk ratings, and we have compliance risk ratings. 

These insights and these data providers are great. They really are the building blocks of risk management. However, what I think has been missing until recently was the capability to pull all of this together so that you can really get a single view of your entire supplier risk exposure across your business in one place.

What has been missing was the capability to pull all of this together so that you can really get a single view of your entire supplier risk exposure across your business.

Technologies such as artificial intelligence (AI), for example, and machine learning (ML) are supporting businesses at various stages of the procurement process in helping to make the right decisions. And that’s what we developed here at SAP Ariba. 

Gardner: It seems to me that 10 years ago when people talked about procurement and supply-chain integrity that they were really thinking about cost savings and process efficiency. Erin, what’s changed since then? And tell us also about Verisk Maplecroft and how you’re allowing a deeper set of variables to be examined when it comes to integrity across supply chains.

McVeigh: There’s been a lot of shift in the market in the last five to 10 years. I think that predominantly it really shifted with environmental regulatory compliance. Companies were being forced to look at issues that they never really had to dig underneath and understand -- not just their own footprint, but to understand their supply chain’s footprint. And then 10 years ago, of course, we had the California Transparency Act, and then from that we had the UK Modern Slavery Act, and we keep seeing more governance compliance requirements. 

But what’s really interesting is that companies are going beyond what’s mandated by regulations. The reason that they have to do that is because they don’t really know what’s coming next. With a global footprint, it changes that dynamic. So, they really need to think ahead of the game and make sure that they’re not reacting to new compliance initiatives. And they have to react to a different marketplace, as Tony explained; it’s a rapidly changing dynamic.

We were talking earlier today about the fact that companies are embracing sustainability, and they’re doing that because that’s what consumers are driving toward.

At Verisk Maplecroft, we came into business about 12 years ago, which was really interesting because the company grew out of a number of individuals who were getting their master’s degrees in supply-chain risk. They began to look at how to quantify risk issues that are so difficult and complex to understand, and to make it simple, easy, and intuitive.

They began with a subset of risk indices. I think initially we looked at 20 risks across the board. Now we’re up to more than 200 risk issues across four thematic issue categories. We begin at the highest pillar of thinking about risks -- like politics, economics, environmental, and social risks. But under each of those risk themes are specific issues that we look at. So, if we’re talking about social risk, we’re looking at diversity and labor, and then under each of those risk issues we go a step further, to the indicators -- it’s all that data matrix coming together that tells the actionable story.

Some companies still just want to check a [compliance] box. Other companies want to dig deeper -- but the power is there for both kinds of companies. They have a very quick way to segment their supply chain, and for those that want to go to the next level to support their consumer demands, to support regulatory needs, they can have that data at their fingertips. 

Global compliance

Gardner: Emily, in this global environment you can’t just comply in one market or area. You need to be global in nature and thinking about all of the various markets and sustainability across them. Tell us what EcoVadis does and how an organization can be compliant on a global scale.

Rakowski: EcoVadis conducts business sustainability ratings, and the way we’re used in the procurement context is primarily that very large multinational companies like Johnson and Johnson or Nestlé will come to us and say, “We would like to evaluate the sustainability factors of our key suppliers.”

They might decide to evaluate only the suppliers that represent a significant risk to the business, or they might decide that they actually want to review all suppliers of a certain scale that represent a certain amount of spend in their business. 

What EcoVadis provides is a 10-year-old methodology for assessing businesses based on evidence-backed criteria. We put out what we call a right-sized questionnaire to the supplier; the supplier responds to material questions based on the kind of goods or services they provide, the geography they are in, and the size of their business.

Of course, very small suppliers are not expected to have very mature and sophisticated capabilities around sustainability systems, but larger suppliers are. So, we evaluate them based on those criteria, and then we collect all kinds of evidence from the suppliers in terms of their policies, their actions, and their results against those policies, and we give them ultimately a 0 to 100 score. 

And that 0 to 100 score is a pretty good indicator to the buying companies of how well that company is doing in their sustainability systems, and that includes such criteria as environmental, labor and human rights, their business practices, and sustainable procurement practices. 
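To make that scoring flow concrete, here is a minimal Python sketch of an evidence-backed, weighted 0 to 100 rating. The theme names, weights, and sub-scores are illustrative assumptions only -- they are not EcoVadis’s actual, proprietary methodology:

```python
from dataclasses import dataclass

# Hypothetical theme names and equal weights -- the real methodology
# and its weighting scheme are EcoVadis's own.
THEME_WEIGHTS = {
    "environment": 0.25,
    "labor_and_human_rights": 0.25,
    "ethics": 0.25,
    "sustainable_procurement": 0.25,
}

@dataclass
class ThemeEvidence:
    policies: float   # 0-100: strength of documented policies
    actions: float    # 0-100: evidence of concrete actions taken
    results: float    # 0-100: measured outcomes against those policies

def theme_score(e: ThemeEvidence) -> float:
    """Blend policy, action, and result evidence into one theme score."""
    return 0.3 * e.policies + 0.4 * e.actions + 0.3 * e.results

def supplier_score(evidence: dict) -> float:
    """Roll the weighted theme scores up into a single 0-100 rating."""
    return round(
        sum(THEME_WEIGHTS[t] * theme_score(e) for t, e in evidence.items()), 1
    )

score = supplier_score({
    "environment": ThemeEvidence(80, 60, 70),
    "labor_and_human_rights": ThemeEvidence(90, 75, 65),
    "ethics": ThemeEvidence(70, 70, 70),
    "sustainable_procurement": ThemeEvidence(50, 40, 30),
})
print(score)  # 63.9 with the sample evidence above
```

The point of the shape, rather than the numbers, is that small suppliers can be scored on fewer, lighter questions while the roll-up stays comparable across the whole supply base.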

Gardner: More data and information are being gathered on these risks on a global scale. But in order to make that information actionable, there’s an aggregation process under way. You’re aggregating on your own -- and SAP Ariba is now aggregating the aggregators.

How then do we make this actionable? What are the challenges, Tony, for making the great work being done by your partners into something that companies can really use and benefit from? 

Timely insights, best business decisions

Harris: Beyond some of the technological challenges of aggregating this data across different providers, there is the need to link it to the aspects of the procurement process that support what our customers are trying to achieve. We must make sure that we can surface those insights at the right point in their process to help them make better decisions.

The other aspect to this is how we’re looking at not just trying to support risk through that source-to-settlement process -- trying to surface those risk insights -- but also understanding that where there’s risk, there is opportunity.

So what we are looking at here is how can we help organizations to determine what value they can derive from turning a risk into an opportunity, and how they can then measure the value they’ve delivered in pursuit of that particular goal. These are a couple of the top challenges we’re working on right now.

We're looking at not just trying to support risk through that source-to-settlement process -- trying to surface those risk insights -- but also understanding that where there is risk there is opportunity.

Gardner: And what about the opportunity for compression of time? Not all challenges are something that are foreseeable. Is there something about this that allows companies to react very quickly? And how do you bring that into a procurement process?

Harris: If we look at risk aspects such as natural disasters, nothing demands a timelier reaction. When our data sources alert us to an earthquake, for example, we’re able to very quickly ascertain which suppliers are affected, and where their factories and distribution centers are.

When you can understand what the impacts are going to be very quickly, and how to respond to that, your mitigation plan is going to prevent the supply chain from coming to a complete halt. 
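As a rough illustration of that kind of alerting, the sketch below matches a disaster event against known supplier sites by distance. The supplier records, coordinates, and impact radius are hypothetical, not data from SAP Ariba’s system:

```python
import math

# Hypothetical supplier-site records; a real system would pull these
# from the supplier master data Harris describes.
SUPPLIER_SITES = [
    {"supplier": "Acme Components", "site": "Osaka factory", "lat": 34.69, "lon": 135.50},
    {"supplier": "Acme Components", "site": "Kobe DC", "lat": 34.69, "lon": 135.20},
    {"supplier": "Globex Metals", "site": "Santiago plant", "lat": -33.45, "lon": -70.67},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def sites_at_risk(quake_lat, quake_lon, radius_km):
    """Return every known supplier site inside the event's impact radius."""
    return [
        s for s in SUPPLIER_SITES
        if haversine_km(quake_lat, quake_lon, s["lat"], s["lon"]) <= radius_km
    ]

# An earthquake alert near Osaka with an assumed 100 km impact radius.
for site in sites_at_risk(34.70, 135.49, 100):
    print(f'ALERT: {site["supplier"]} -- {site["site"]} inside impact zone')
```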

Gardner: We have to ask the obligatory question these days about AI and ML. What are the business implications for tapping into what’s now possible technically for better analyzing risks and even forecasting them? 

AI risk assessment reaps rewards

Harris: If you look at AI, this is a great technology, and what we’re trying to do is really simplify the process for our customers of figuring out how they can take action on the information we’re providing. So rather than them having to be experts in risk analysis and doing all of this analysis themselves, AI allows us to surface those risks through the technology -- through our procurement suite, for example -- to impact the decisions they’re making.

For example, if I’m in the process of awarding a piece of sourcing business off of a request for proposal (RFP), the technology can surface the risk insights against the supplier I’m about to award business to right at that point in time. 

A determination can be made based upon the goods or the services I’m looking to award to the supplier or based on the part of the world they operate in, or where I’m looking to distribute these goods or services. If a particular supplier has a risk issue that we feel is too high, we can act upon that. Now that might mean we postpone the award decision before we do some further investigation, or it may mean we choose not to award that business. So, AI can really help in those kinds of areas. 

Gardner: Emily, when we think about the pressing need for insight, we think about both data and analysis capabilities. This isn’t something necessarily that the buyer or an individual company can do alone if they don’t have access to the data. Why is your approach better and how does AI assist that?

Rakowski: In our case, it’s all about allowing for scale. The way that we’re applying AI and ML at EcoVadis is we’re using it to do an evidence-based evaluation.

We collect a great amount of documentation from the suppliers we’re evaluating, and AI is helping us scan through that documentation more quickly. That way we can find the relevant information our analysts are looking for and compress the evaluation time for each supplier from what used to be about six or seven hours down to three or four. So that’s essentially allowing us to double our workforce of analysts in a heartbeat.

AI is helping us scan through the documentation more quickly. That way we can find the relevant information that our analysts are looking for, allowing us to double our workforce of analysts.

The other thing it’s doing is helping scan through material news feeds. We’re collecting from more than 2,500 news sources, spanning all kinds of reports, from China Labor Watch to OSHA. These technologies help us scan those reports for material information and then put it in front of our analysts. It helps them surface real-time news that we’re sure, at that point, is material.

And that way we’re combining AI with real human analysis and validation to make sure that what we’re serving is accurate and relevant.

Harris: And that’s a great point, Emily. On the SAP Ariba side, we also use ML in analyzing similarly vast amounts of content from across the Internet. We’re scanning more than 600,000 data sources on a daily basis for information on any number of risk types. We’re scanning that content for more than 200 different risk types.

We use ML in that context to find an issue, or an article, for example, or a piece of bad news, bad media. The software effectively reads that article electronically. It understands that this is actually the supplier we think it is, the supplier that we’ve tracked, and it understands the context of that article. 

By effectively reading that text electronically, a machine has concluded, “Hey, this is about a contract reduction; it may be the company just lost a piece of business and they had to downsize, and so that presents a potential risk to our business because maybe this supplier is on their way out of business.”

And the software using ML figures all that stuff out by itself. It defines a risk rating, a score, and brings that information to the attention of the appropriate category manager and various users. So, it is very powerful technology that can number crunch and read all this content very quickly. 
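The flow Harris describes -- match the article to a tracked supplier, classify the risk context, and route a scored alert -- can be sketched in miniature. The keyword lists below are a crude stand-in for the trained ML models the production software uses, and the supplier names and signals are invented:

```python
from typing import Optional

# Stand-in risk taxonomy: the production system covers 200+ risk types;
# these three keyword lists are illustrative only.
RISK_SIGNALS = {
    "financial_distress": ["downsize", "layoff", "lost contract", "bankruptcy"],
    "forced_labor": ["forced labor", "child labor", "trafficking"],
    "environmental": ["spill", "toxic", "dumping", "emissions violation"],
}

KNOWN_SUPPLIERS = ["Acme Components", "Globex Metals"]  # hypothetical supplier master

def match_supplier(article: str) -> Optional[str]:
    """Crude entity resolution: does the article mention a tracked supplier?"""
    for name in KNOWN_SUPPLIERS:
        if name.lower() in article.lower():
            return name
    return None

def classify_risks(article: str) -> dict:
    """Count signal hits per risk type; a real system uses trained classifiers."""
    text = article.lower()
    hits = {
        risk: sum(1 for kw in kws if kw in text)
        for risk, kws in RISK_SIGNALS.items()
    }
    return {risk: n for risk, n in hits.items() if n > 0}

article = ("Acme Components announced layoffs after it lost contract worth "
           "$40M, raising questions about the firm's viability.")

supplier = match_supplier(article)
if supplier:
    for risk, strength in classify_risks(article).items():
        print(f"{supplier}: {risk} (signal strength {strength}) -> route to category manager")
```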

Gardner: Erin, at Maplecroft, how are such technologies as AI and ML being brought to bear, and what are the business benefits to your clients and your ecosystem? 

The AI-aggregation advantage

McVeigh: As an aggregator of data, it’s basically the bread and butter of what we do. We bring all of this information together, and ML and AI allow us to do it faster and more reliably.

We look at many indices. We actually just revamped our social indices a couple of years ago.

Before that you had a human who was sitting there, maybe they were having a bad day and they just sort of checked the box. But now we have the capabilities to validate that data against true sources. 

Just as Emily mentioned, we were able to significantly reduce the number of human-rights analysts it took to create an index, and allow them to go out and begin working on additional types of projects for our customers. This helped our customers utilize the data that’s being automated and generated for them.

We also talked about what customers are expecting when they think about data these days. They’re thinking about the price of data coming down. They’re expecting it to be more dynamic, they’re expecting it to be more granular. And to be able to provide data at that level, it’s really the combination of technology with the intelligent data scientists, experts, and data engineers that bring that power together and allow companies to harness it. 

Gardner: Let’s get more concrete about how this goes to market. Tony, at the recent SAP Ariba Live conference, you announced the Ariba Supplier Risk improvements. Tell us about the productization of this, how people intercept with it. It sounds great in theory, but how does this actually work in practice?

Partnership prowess

Harris: What we announced at Ariba Live in March is the partnership between SAP Ariba, EcoVadis and Verisk Maplecroft to bring this combined set of ESG and CSR insights into SAP Ariba’s solution.

We do not yet have the solution generally available, so we are currently working on building out the integration with our partners. We have a number of common customers working with us as what we call design partners. There’s no better customer, ultimately, than a customer already using these solutions from our companies. We anticipate making this available in the Q3 2018 time frame.

And with that, customers that have an active subscription to our combined solutions are then able to benefit from the integration, whereby we pull this data from Verisk Maplecroft, and we pull the CSR score cards, for example, from EcoVadis, and then we are able to present that within SAP Ariba’s supplier risk solution directly. 

What it means is that users can get that aggregated, high-level view across all of these different risk types and metrics in one place. However, if they ultimately want to get to the nth degree of detail, they will have the ability to click through and go naturally into the solutions from our partners as well, to drill right down to that level of detail. The aim here is to give them that high-level view to help them with their overall assessments of these suppliers.

Gardner: Over time, is this something that organizations will be able to customize? They will have dials to tune in or out certain risks in order to make it more applicable to their particular situation?

Customers that have an active subscription to our combined solutions are then able to benefit from the integration and see all that data within SAP Ariba's supplier risk solutions directly.

Harris: Yes, and that’s a great question. We already address that in our solutions today. We cover more than 200 risk types, and we categorize those into four primary risk categories. The way the risk exposure score works is that the customer gets to decide how to weight each of the contributing attributes that go into that calculation.

If I have more of a bias toward financial risk aspects, or more of a bias toward ESG metrics, for example, then I can weight that part of the scoring algorithm appropriately.
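A minimal sketch of that kind of customer-weighted exposure score follows. The four category names echo the primary risk categories Harris mentions, but the scores and the two weighting profiles are made up for illustration:

```python
# Hypothetical category scores (0-100, higher = riskier) for one supplier,
# and two customers' weighting profiles, as Harris describes.
CATEGORY_SCORES = {"financial": 72, "esg": 45, "compliance": 30, "operational": 55}

FINANCE_BIASED = {"financial": 0.5, "esg": 0.2, "compliance": 0.2, "operational": 0.1}
ESG_BIASED     = {"financial": 0.2, "esg": 0.5, "compliance": 0.2, "operational": 0.1}

def exposure(scores: dict, weights: dict) -> float:
    """Weighted risk exposure; the weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return round(sum(scores[c] * w for c, w in weights.items()), 1)

print(exposure(CATEGORY_SCORES, FINANCE_BIASED))  # 56.5 -- financial lens
print(exposure(CATEGORY_SCORES, ESG_BIASED))      # 48.4 -- same supplier, ESG lens
```

The same supplier produces two different exposure numbers, which is exactly the configurability being described: the data is shared, the weighting is the customer’s.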

Gardner: Before we close out, let’s examine the paybacks or penalties when you either do this well -- or not so well.

Erin, when an organization can fully avail themselves of the data, the insight, the analysis, make it actionable, make it low-latency -- how can that materially impact the company? Is this a nice-to-have, or how does it affect the bottom line? How do we make business value from this?

Nice-to-have ROI

McVeigh: One of the things that we’re still working on is quantifying the return on investment (ROI) for companies that are able to mitigate risk, because the event didn’t happen.

How do you put a tangible dollar value on something that didn’t occur? What we can do is take data acquired over the past few years and understand that, as we see risk reduction over time, we can begin to source more suppliers, add diversity to the supply chain, or even minimize the supply chain, depending on how you want to move forward in your risk landscape and supplier diversification program. It’s giving companies the power to make those decisions faster and in a more actionable way.

And so, while many companies still think about data and tools around ethical sourcing or sustainable procurement as a nice-to-have, those leaders in the industry today are saying, “It’s no longer a nice-to-have, we’re actually changing the way we have done business for generations.”

Other companies are also beginning to see that it’s not being pushed down on them anymore from the large retailers and large organizations; it’s a choice they make to do better business. They are also realizing that there’s a big ROI from putting in that upfront infrastructure and having dedicated resources that understand and utilize the data. They still need to internally create a strategy and make decisions about business process.

We can automate through technology, we can provide data, and we can help to create technology that embeds their business process into it -- but ultimately it requires a company to embrace a culture, and a cultural shift to where they really believe that data is the foundation, and that technology will help them move in this direction.

Gardner: Emily, for companies that don’t have that culture, that don’t think seriously about what’s going on with their suppliers, what are some of the pitfalls? When you don’t take this seriously, are bad things going to happen? 

Pay attention, be prepared

Rakowski: There are dozens and dozens of stories out there about companies that have not paid attention to critical ESG aspects and suffered the consequences of a horrible brand hit or a fine from a regulatory situation. And any of those things easily cost that company on the order of a hundred times what it would cost to actually put in place a program and some supporting services and technologies to try to avoid that. 

From an ROI standpoint, there’s a lot of evidence out there in terms of these stories. For companies that are not really as sophisticated or ready to embrace sustainable procurement, it is a challenge. Hopefully there are some positive mavericks out there in the businesses that are willing to stake their reputation on trying to move in this direction, understanding that the power they have in the procurement function is great. 

They can use their company’s resources to bet on supply-chain actors that are doing the right thing: that are paying living wages, that are not overworking their employees, that are not dumping toxic chemicals in our rivers. These are all things that, I think, everybody is coming to realize are really a must, regardless of regulations.

Hopefully there are some positive mavericks out there who are willing to stake their reputations on moving in this direction. The power they have in the procurement function is great.

And so, it’s really those individuals that are willing to stand up, take a stand and think about how they are going to put in place a program that will really drive this culture into the business, and educate the business. Even if you’re starting from a very little group that’s dedicated to it, you can find a way to make it grow within a culture. I think it’s critical.

Gardner: Tony, for organizations interested in taking advantage of these technologies and capabilities, what should they be doing to prepare to best use them? What should companies be thinking about as they get ready for such great tools that are coming their way?

Synergistic risk management

Harris: Organizationally, there tend to be a couple of different teams inside a business that manage risks. On the one hand, there can be the governance, risk, and compliance team. On the other hand, there can be the corporate social responsibility team.

I think first of all, bringing those two teams together in some capacity makes complete sense because there are synergies across those teams. They are both ultimately trying to achieve the same outcome for the business, right? Safeguard the business against unforeseen risks, but also ensure that the business is doing the right thing in the first place, which can help safeguard the business from unforeseen risks.

I think getting the organizational model right, and also thinking about how they can best begin to map out their supply chains are key. One of the big challenges here, which we haven’t quite solved yet, is figuring out who are the players or supply-chain actors in that supply chain? It’s pretty easy to determine now who are the tier-one suppliers, but who are the suppliers to the suppliers -- and who are the suppliers to the suppliers to the suppliers?

We’ve yet to actually build a better technology that can figure that out easily. We’re working on it; stay posted. But I think trying to compile that information upfront is great, because once you get that mapping done, our software, and our partner software from EcoVadis and Verisk Maplecroft, is there to surface those kinds of risks inside and across the entire supply chain.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

You may also be interested in:

How HudsonAlpha transforms hybrid cloud complexity into an IT force multiplier

The next BriefingsDirect hybrid IT management success story examines how the nonprofit research institute HudsonAlpha improves how it harnesses and leverages a spectrum of IT deployment environments.

We’ll now learn how HudsonAlpha has been testing a new Hewlett Packard Enterprise (HPE) solution, OneSphere, to gain a common and simplified management interface to rule them all.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

Here to help explore the benefits of improved levels of multi-cloud visibility and process automation is Katreena Mullican, Senior Architect and Cloud Whisperer at HudsonAlpha Institute for Biotechnology in Huntsville, Alabama. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What’s driving the need to solve hybrid IT complexity at HudsonAlpha?

Mullican: The big drivers at HudsonAlpha are the requirements for data locality and ease-of-adoption. We produce about 6 petabytes of new data every year, and that rate is increasing with every project that we do.

We support hundreds of research programs with data and trend analysis. Our infrastructure requires quick iteration to identify the approaches that are both cost-effective and the best fit for the needs of our users.

Gardner: Do you find that having multiple types of IT platforms, environments, and architectures creates a level of complexity that’s increasingly difficult to manage?

Mullican: Gaining a competitive edge requires adopting new approaches to hybrid IT. Even carefully contained shadow IT is a great way to develop new approaches and attain breakthroughs.

Gardner: You want to give people enough leash so they can roam and experiment, but perhaps not so much that you don’t know where they are or what they are doing.

Software-defined everything 

Mullican: Right. “Software-defined everything” is our mantra. That’s what we aim to do at HudsonAlpha for gaining rapid innovation.

Gardner: How do you gain balance from too hard-to-manage complexity, with a potential of chaos, to the point where you can harness and optimize -- yet allow for experimentation, too?

Mullican: IT is ultimately responsible for the security and the up-time of the infrastructure. So it’s important to have a good framework on which the developers and the researchers can compute. It’s about finding a balance between letting them have provisioning access to those resources versus being able to keep an eye on what they are doing. And not only from a usage perspective, but from a cost perspective, too.

Gardner: Tell us about HudsonAlpha and its fairly extreme IT requirements.

Mullican: HudsonAlpha is a nonprofit organization of entrepreneurs, scientists, and educators who apply the benefits of genomics to everyday life. We also provide IT services and support for about 40 affiliate companies on our 150-acre campus in Huntsville, Alabama.

Gardner: What about the IT requirements? How do you fulfill that mandate using technology?

Mullican: We produce 6 petabytes of new data every year. We have millions of hours of compute processing time running on our infrastructure. We have hardware acceleration. We have direct connections to clouds. We have collaboration for our researchers that extends throughout the world to external organizations. We use containers, and we use multiple cloud providers. 

Gardner: So you have been doing multi-cloud before there was even a word for multi-cloud?

Mullican: We are the hybrid-scale and hybrid IT organization that no one has ever heard of.

Gardner: Let’s unpack some of the hurdles you need to overcome to keep all of your scientists and researchers happy. How do you avoid lock-in? How do you keep it so that you can remain open and competitive?

Agnostic arrangements of clouds

Mullican: It’s important for us to keep our local datacenters agnostic, as well as our private and public clouds. So we strive to communicate with all of our resources through application programming interfaces (APIs), and we use open-source technologies at HudsonAlpha. We are proud of that. Yet there are a lot of possibilities for arranging all of those pieces.

There are a lot [of services] that you can combine with the right toolsets, not only in your local datacenter but also in the clouds. If you put in the effort to write the code with that in mind -- so you don’t lock into any one solution necessarily -- then you can optimize and put everything together.
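As a sketch of what that provider-agnostic, API-first approach can look like in code, the following defines one small interface that a local datacenter and a public cloud both implement. The class and method names are hypothetical, not any vendor’s actual API:

```python
from abc import ABC, abstractmethod

class ComputeProvider(ABC):
    """One small interface that every backend -- local datacenter or
    public cloud -- must implement, so callers never bind to a vendor."""

    @abstractmethod
    def provision(self, cpus: int, memory_gb: int) -> str:
        """Create an instance and return its identifier."""

class LocalDatacenter(ComputeProvider):
    def provision(self, cpus, memory_gb):
        # Would call the on-premises composable-infrastructure API here.
        return f"local-{cpus}c-{memory_gb}g"

class PublicCloud(ComputeProvider):
    def __init__(self, name):
        self.name = name
    def provision(self, cpus, memory_gb):
        # Would call the cloud vendor's own SDK here.
        return f"{self.name}-{cpus}c-{memory_gb}g"

def run_pipeline(provider: ComputeProvider):
    # The research workload code is identical regardless of where it lands.
    instance = provider.provision(cpus=16, memory_gb=64)
    print(f"pipeline scheduled on {instance}")

run_pipeline(LocalDatacenter())
run_pipeline(PublicCloud("aws"))
```

Writing against the interface rather than the vendor SDK is what keeps the pieces rearrangeable without lock-in.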

Gardner: Because you are a nonprofit institute, you often seek grants. But those grants can come with unique requirements, even IT use benefits and cloud choice considerations.

Cloud cost control, granted

Mullican: Right. Researchers are applying for grants throughout the year, and now with the National Institutes of Health (NIH), when grants are awarded, they come with community cloud credits, which is an exciting idea for the researchers. It means they can immediately begin consuming resources in the cloud -- from storage to compute -- and that cost is covered by the grant.

So they are anxious to get started on that, which brings challenges to IT. We certainly don’t want to be the holdup for that innovation. We want the projects to progress as rapidly as possible. At the same time, we need to be aware of what is happening in a cloud and not lose control over usage and cost.

Gardner: Certainly HudsonAlpha is an extreme test bed for multi-cloud management, with lots of different systems, changing requirements, and the need to provide the flexibility to innovate to your clientele. When you wanted a better management capability, to gain an overview into that full hybrid IT environment, how did you come together with HPE and test what they are doing?

Variety is the spice of IT

Mullican: We’ve invested in composable infrastructure and hyperconverged infrastructure (HCI) in our datacenter, as well as blade server technology. We have a wide variety of compute, networking, and storage resources available to us.

The key is: How do we rapidly provision those resources in an automated fashion? I think the key there is not only for IT to be aware of those resources, but for developers to be as well. We have groups of developers dealing with bioinformatics at HudsonAlpha. They can benefit from all of the different types of infrastructure in our datacenter. What HPE OneSphere does is enable them to access -- through a common API -- that infrastructure. So it’s very exciting.

Gardner: What did HPE OneSphere bring to the table for you in order to be able to rationalize, visualize, and even prioritize this very large mixture of hybrid IT assets?

Mullican: We have been beta testing HPE OneSphere since October 2017, and we have tied it into our VMware ESX Server environment, as well as our Amazon Web Services (AWS) environment successfully -- and that’s at an IT level. So our next step is to give that to researchers as a single pane of glass where they can go and provision the resources themselves.

Gardner: What might this capability bring to you and your organization?

Cross-training the clouds

Mullican: We want to do more with cross-cloud. Right now we are very adept at provisioning within our datacenters and provisioning within each individual cloud. HudsonAlpha has a presence in all the major public clouds -- AWS, Google, and Microsoft Azure. But the next step would be to go cross-cloud, to provision applications across them all.

For example, you might have an application that runs as a series of microservices. So you can have one microservice take advantage of your on-premises datacenter, such as for local storage. And then another piece could take advantage of object storage in the cloud. And even another piece could be in another separate public cloud.

But the key here is that our developers and researchers -- the end users of OneSphere -- don’t need to know all of the specifics of provisioning in each of those environments. That level of expertise isn’t in their wheelhouse. In this new OneSphere way, all they know is that they are provisioning the application in the pipeline -- and that’s what the researchers will use. Then it’s up to us in IT to come along and keep an eye on what they are doing through the analytics that HPE OneSphere provides.
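One hedged sketch of such a cross-cloud application: each microservice declares where it should land, and a common API layer does the dispatch. The service names, targets, and the dispatch function are illustrative stand-ins, not how OneSphere is actually implemented:

```python
# Hypothetical placement spec: each microservice of one application is
# pinned to the environment that suits it, but researchers only ever
# submit the application as a whole.
APP_PIPELINE = {
    "genome-ingest":   {"target": "on-prem", "reason": "local storage, data locality"},
    "variant-calling": {"target": "aws",     "reason": "burst compute"},
    "results-archive": {"target": "azure",   "reason": "object storage tier"},
}

def provision_app(pipeline: dict) -> None:
    """Walk the spec and hand each service to the right environment's API.
    In practice a common API layer does this dispatch; the loop below is
    a stand-in for that layer."""
    for service, spec in pipeline.items():
        print(f"provisioning {service} -> {spec['target']} ({spec['reason']})")

provision_app(APP_PIPELINE)
```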

Gardner: Because OneSphere gives you the visibility to see what the end users are doing, potentially, for cost optimization and remaining competitive, you may be able to play one cloud off another. You may even be able to automate and orchestrate that.

Mullican: Right, and that will be an ongoing effort to always optimize cost -- but not at the risk of slowing the research. We want the research to happen, and to innovate as quickly as possible. We don’t want to be the holdup for that. But we definitely do need to loop back around and keep an eye on how the different clouds are being used and make decisions going forward based on the analytics.

Gardner: There may be other organizations that are going to be more cost-focused, and they will probably want to dial back to get the best deals. It’s nice that we have the flexibility to choose an algorithmic approach to business, if you will.

Mullican: Right. The research that we do at HudsonAlpha saves lives, and it is of the utmost importance to be able to conduct that research at the fastest speed.

Gardner: HPE OneSphere seems geared toward being cloud-agnostic. They are beginning on AWS, yet they are going to be adding more clouds. And they are supporting more internal private cloud infrastructures, and using an API-driven approach to microservices and containers.

The research that we do at HudsonAlpha saves lives, and it is of the utmost importance to be able to conduct the research at the fastest speed.

As an early tester, and someone who has been a long-time user of HPE infrastructure, is there anything about the combination of HPE Synergy, HPE SimpliVity HCI, and HPE 3PAR intelligent storage -- in conjunction with OneSphere -- that’s given you a "whole greater than the sum of the parts" effect?

Mullican: HPE Synergy and composable infrastructure is something that is very near and dear to me. I have a lot of hours invested with HPE Synergy Image Streamer and customizing open-source applications on Image Streamer -- open-source operating systems and applications.

The ability to utilize that in the mix that I have architected natively with OneSphere -- in addition to the public clouds -- is very powerful, and I am excited to see where that goes.

Gardner: Any words of wisdom to others who may be have not yet gone down this road? What do you advise others to consider as they are seeking to better compose, automate, and optimize their infrastructure? 

Get adept at DevOps

Mullican: It needs to start with IT. IT needs to take on more of a DevOps approach.

As far as putting an emphasis on automation -- and being able to provision infrastructure in the datacenter and the cloud through automated APIs -- a lot of companies probably are still slow to adopt that. They are still provisioning with older methods, and I think it’s important that they make that shift. Then, once your IT department is adept at DevOps, your developers can begin feeding from that and using what IT has laid down as a foundation. So it needs to start with IT.

It involves a skill set change for some of the traditional system administrators and network administrators. But now, with software-defined networking (SDN) and with automated deployments and provisioning of resources -- that’s a skill set that IT really needs to step up and master. That’s because they are going to need to set the example for the developers who are going to come along and be able to then use those same tools.

That’s the partnership that companies really need to foster -- and it’s between IT and developers. And something like HPE OneSphere is a good fit for that, because it provides a unified API.

On one hand, your IT department can be busy mastering how to communicate with their infrastructure through that tool. And at the same time, they can be refactoring applications as microservices, and that’s up to the developer teams. So both can be working on all of this at the same time.

Then, when it all comes together with a service catalog of options, in the end it’s just a simple interface. That’s what we want: to provide a simple interface for the researchers. They don’t have to think about all the work that went into the infrastructure; they are just choosing the proper workflow and pipeline for future projects.

We want to provide a simple interface to the researchers. They don't have to think about all the work that went into the infrastructure.

Gardner: It also sounds, Katreena, like you are able to elevate IT to a solutions-level abstraction, and that OneSphere is an accelerant to elevating IT. At the same time, OneSphere is an accelerant to the adoption of DevOps, which means it’s also elevating the developers. So are we really finally bringing people to that higher plane of business-focus and digital transformation?

HCI advances across the globe

Mullican: Yes. HPE OneSphere is an advantage to both of those departments, which in some companies can still be quite disparate. At HudsonAlpha, we do DevOps within IT; it’s not a separate department. But in some companies that’s not the case.

And I think we have a lot of advantages because we think in terms of automation, and we think in terms of APIs from the infrastructure standpoint. And the tools that we have invested in, the types of composable and hyperconverged infrastructure, are helping accomplish that.

Gardner: I speak with a number of organizations that are global, and they have some data sovereignty concerns. I’d like to explore, before we close out, how OneSphere also might be powerful in helping to decide where data sets reside in different clouds, private and public, for various regulatory reasons.

Is there something about having that visibility into hybrid IT that extends into hybrid data environments?

Mullican: Data locality is one of our driving factors in IT, and we do have on-premises storage as well as cloud storage. There is a time and a place for both of those, and they do not always mix, but we have requirements for our data to be available worldwide for collaboration.

So, the services that HPE OneSphere makes available are designed to use the appropriate data connections, whether that would be back to your object storage on-premises, or AWS Simple Storage Service (S3), for example, in the cloud.
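Because many on-premises object stores speak the same S3 protocol, that kind of routing can be as simple as switching the endpoint. A minimal boto3 sketch follows; the internal endpoint, bucket, and object names are hypothetical:

```python
import boto3

# Hypothetical on-premises endpoint -- a real deployment would read this
# from configuration rather than hard-coding it.
ON_PREM_ENDPOINT = "https://objects.example.internal:9000"

def object_client(dataset_is_restricted: bool):
    """Route restricted datasets to on-premises object storage and
    shareable ones to AWS S3, using the same S3 API for both."""
    if dataset_is_restricted:
        # Many on-prem object stores (MinIO, Ceph RGW) speak the S3
        # protocol, so the only difference is the endpoint URL.
        return boto3.client("s3", endpoint_url=ON_PREM_ENDPOINT)
    return boto3.client("s3")  # default AWS endpoints

# Upload a shareable result set to the cloud bucket (names invented).
client = object_client(dataset_is_restricted=False)
client.upload_file("results.vcf.gz", "shared-research-data", "proj42/results.vcf.gz")
```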

Gardner: Now we can think of HPE OneSphere as also elevating data scientists -- and even the people in charge of governance, risk management, and compliance (GRC) around adhering to regulations. It seems like it’s a gift that keeps giving.

Hybrid hard work pays off

Mullican: It is a good fit for hybrid IT and what we do at HudsonAlpha. It’s a natural addition to all of the preparation work that we have done in IT around automated provisioning with HPE Synergy and Image Streamer.

HPE OneSphere is a way to showcase to the end user all of the efforts that have been, and are being, done by IT. That’s why it’s a satisfying tool to implement, because, in the end, you want what you have worked on so hard to be available to the researchers and be put to use easily and quickly.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Inside story on HPC's role in the Bridges Research Project at Pittsburgh Supercomputing Center

The next BriefingsDirect Voice of the Customer high-performance computing (HPC) success story interview examines how Pittsburgh Supercomputing Center (PSC) has developed a research computing capability, Bridges, and how that's providing new levels of analytics, insights, and efficiencies.

We'll now learn how advances in IT infrastructure and memory-driven architectures are combining to meet the new requirements for artificial intelligence (AI), big data analytics, and deep machine learning.

Philips teams with HPE on ecosystem approach to improve healthcare informatics-driven outcomes

The next BriefingsDirect healthcare transformation use-case discussion focuses on how an ecosystem approach to big data solutions brings about improved healthcare informatics-driven outcomes.

We'll now learn how a Philips Healthcare Informatics and Hewlett Packard Enterprise (HPE) partnership creates new solutions for the global healthcare market and provides better health outcomes for patients by managing data and intelligence better.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

Joining us to explain how companies tackle the complexity of solutions delivery in healthcare by using advanced big data and analytics is Martijn Heemskerk, Healthcare Informatics Ecosystem Director for Philips, based in Eindhoven, the Netherlands. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.


Here are some excerpts:

Gardner: Why are partnerships so important in healthcare informatics? Is it because there are clinical considerations combined with big data technology? Why are these types of solutions particularly dependent upon an ecosystem approach?

Heemskerk: It’s exactly as you say, Dana. At Philips we are very strong at developing clinical solutions for our customers. But nowadays those solutions also require an IT infrastructure layer underneath to solve the total equation. As such, we are looking for partners in the ecosystem because we at Philips recognize that we cannot do everything alone. We need partners in the ecosystem that can help address the total solution -- or the total value proposition -- for our customers.

Gardner: I'm sure it varies from region to region, but is there a cultural barrier in some regard to bringing cutting-edge IT in particular into healthcare organizations? Or have things progressed to where technology and healthcare converge?

Heemskerk: Of course, there are some countries that are more mature than others. Therefore the level of healthcare and the type of solutions that you offer to different countries may vary. But in principle, many of the challenges that hospitals everywhere are going through are similar.

Some of the not-so-mature markets are also trying to leapfrog so that they can deliver different solutions that are up to par with the mature markets.

Gardner: Because we are hearing a lot about big data and edge computing these days, we are seeing the need for analytics at a distributed architecture scale. Please explain how big data changes healthcare.

Big data value add

Heemskerk: What is very interesting is what happens when you combine big data with value-based care. For example, nowadays a hospital is not reimbursed for every procedure that it does in the hospital -- the value is based more on the total outcome of how a patient recovers.

This means that more analytics need to be gathered across different elements of the process chain before reimbursement takes place. In that sense, analytics become very important for hospitals in measuring whether things are being done efficiently, and in determining if the costs are okay.

Gardner: The same data that can be used to become more efficient can also be used for better healthcare outcomes -- understanding the path of a disease, or the efficacy of procedures, and so on. A great deal can be gained when data is gathered and used properly.

Heemskerk: That is correct. And you see, indeed, that there is much more data nowadays, and you can utilize it for all kind of different things.

Gardner: Please help us understand the relationship between your organization and HPE. Where does your part of the value begin and end, and how does HPE fill their role on the technology side?

Healthy hardware relationships 

Heemskerk: HPE has been a highly valued supplier of Philips for quite a long time. We use their technologies for all kinds of different clinical solutions. For example, all of the hardware that we use for our back-end solutions or for advanced visualization is sourced from HPE. I am focusing very much on the commercial side of the game, so to speak, where we are really looking at how we can jointly go to market.

As I said, customers are really looking for one-stop shopping, a complete value proposition, for the challenges that they are facing. That’s why we partner with HPE on a holistic level.

Gardner: Does that involve bringing HPE into certain accounts and vice versa, and then going in to provide larger solutions together?

Heemskerk: Yes, that is exactly the case, indeed. We recognized that we should not focus just on the clinical implications, nor just on the problems that HPE addresses -- the IT infrastructure and the connectivity side of the value chain. Instead, we are really looking at the problems that the C-suite-level healthcare executives are facing.

You can think about healthcare industry consolidation, for example, as a big topic. Many hospitals are now moving into a cluster or into a network and that creates all kinds of challenges, both on the clinical application layer, but also on the IT infrastructure. How do you harmonize all of this? How do you standardize all of your different applications? How do you make sure that hospitals are going to be connected? How do you align all of your processes so that there is a more optimized process flow within the hospitals?

By addressing these kinds of questions and jointly going to our customers with HPE, we can improve the user experience for customers, create better services, optimize the solutions, and deliver a lot of time savings for the hospitals as well.

Gardner: We have certainly seen in other industries that if you try IT modernization without including the larger organization -- the people, the process, and the culture -- the results just aren’t as good. It is important to go at modernization and transformation, consolidation of data centers, for example, with that full range of inputs and getting full buy-in.

Who else makes up the ecosystem? It takes more than two players to make an ecosystem.

Heemskerk: Yes, that's very true, indeed. In this, system integrators also have a very important role. They can have an independent view on what would be the best solution to fit a specific hospital.

Of course, we think that the Philips healthcare solutions are quite often the best, jointly focused with the solutions from HPE, but from time to time you can be partnering with different vendors.

Besides that, we don't have all of the clinical applications ourselves. By partnering with other vendors in the ecosystem, you can sometimes enhance the solutions that we offer -- think of 3D solutions and 3D printing solutions, for example.

Gardner: When you do this all correctly, when you leverage and exploit an ecosystem approach, when you cover the bases of technology, finance, culture, and clinical considerations, how much of an impressive improvement can we typically see?

Saving time, money, and people

Heemskerk: We try to look at it customer by customer, but generically what we see is that there are really a lot of savings.

First of all, addressing standardization across the clinical application layer means that a customer doesn't have to spend a lot of money on training all of its hospital employees on different kinds of solutions. So that's already a big savings.

Secondly, by harmonizing and making better effective use of the clinical applications, you can drive the total cost of ownership down.

Thirdly, it means that on the clinical applications layer, there are a lot of efficiency benefits possible. For example, advanced analytics make it possible to reduce the time that clinicians or radiologists are spending on analyzing different kinds of elements, which also creates time savings.

Gardner: Looking more to the future, as technologies improve, as costs go down, as they typically do, as hybrid IT models are utilized and understood better -- where do you see things going next for the healthcare sector when it comes to utilizing technology, utilizing informatics, and improving their overall process and outcomes?

Heemskerk: What would be very interesting for me to see is if we can create some kind of patient-centric data file for each patient. You see that consumers are increasingly engaged in their own health, with all the different devices like Fitbit, Jawbone, Apple Watch, etc. coming up. This is creating a massive amount of data. But there is much more data that you can put into such a patient-centric file, such as chronic disease information, now that people are being monitored much more, and much more often.

If you can have a chronological view of all of the different touch points that the patient has in the hospital, combined with the drugs that the patient is using etc., and you have that all in this patient-centric file -- it will be very interesting. And everything, of course, needs to be interconnected. Therefore, Internet of Things (IoT) technologies will become more important. And as the data is growing, you will have smarter algorithms that can also interpret that data – and so artificial intelligence (AI) will become much more important.
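A patient-centric file of that sort is, at heart, a chronologically merged event log across sources. Here is a minimal data-structure sketch of that idea; the fields, sources, and sample events are invented for illustration:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass(order=True)
class TouchPoint:
    """One event in the patient's history, regardless of source."""
    timestamp: datetime
    source: str = field(compare=False)   # e.g. "hospital", "wearable", "pharmacy"
    detail: str = field(compare=False)

@dataclass
class PatientRecord:
    patient_id: str
    events: list = field(default_factory=list)

    def timeline(self):
        """The chronological view Heemskerk describes, merged across sources."""
        return sorted(self.events)  # TouchPoint ordering compares timestamps only

record = PatientRecord("p-001")
record.events += [
    TouchPoint(datetime(2018, 3, 2, 9, 15), "hospital", "cardiology consult"),
    TouchPoint(datetime(2018, 3, 1, 7, 0), "wearable", "resting heart rate 58 bpm"),
    TouchPoint(datetime(2018, 3, 2, 11, 0), "pharmacy", "beta-blocker dispensed"),
]
for e in record.timeline():
    print(e.timestamp, e.source, e.detail)
```

The interconnection and smart interpretation of that growing log is where the IoT and AI technologies he mentions come in.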

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

How IoT capabilities open new doors for Miami Telecoms Platform Provider Identidad

DreamWorks Animation crafts its next era of dynamic IT infrastructure

How Enterprises Can Take the Ecosystem Path to Making the Most of Microsoft Azure Stack Apps

Hybrid Cloud ecosystem readies for impact from Microsoft Azure Stack

Converged IoT systems: Bringing the data center to the edge of everything

IDOL-powered appliance delivers better decisions via comprehensive business information searches

OCSL sets its sights on the Nirvana of hybrid IT—attaining the right mix of hybrid cloud for its clients

Fast acquisition of diverse unstructured data sources makes IDOL API tools a star at LogitBot

How lastminute.com uses machine learning to improve travel bookings user experience

HPE takes aim at customer needs for speed and agility in age of IoT, hybrid everything