How SAP Ariba became a first-mover as Blockchain comes to B2B

The next BriefingsDirect digital business thought leadership panel discussion examines the major opportunity from bringing Blockchain technology to business-to-business (B2B) procurement and supply chain management.

We will now explore how Blockchain’s unique capabilities can provide comprehensive visibility across global supply chains and drive simpler verification of authenticity, security, and ultimately control.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about how Blockchain is poised to impact and improve supply chain risk and management, we're joined by Joe Fox, Senior Vice President for Business Development and Strategy at SAP Ariba, and Leanne Kemp, Founder and CEO of Everledger, based in London. The panel was assembled and recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Joe, Blockchain has emerged as a network methodology, best known for running the cryptocurrency Bitcoin. It's a digitally shared record of transactions maintained by a network of computers, not necessarily with a centralized authority. What could this be used for powerfully when it comes to gaining supply chain integrity?

Fox: Blockchain did start in the Bitcoin area, as peer-to-peer consumer functionality. But a lot of the capabilities of Blockchain have been recognized as important for new areas of innovation in the enterprise software space.


Those areas of innovation are around “trusted commerce.” Trusted commerce allows buyers and sellers, and third parties, to gain more visibility into asset-tracking. Not just asset tracking in the context of the buyer receiving and the seller shipping -- but in the context of where is the good in transit? What do I need to do to protect that good? What is the transfer of funds associated with that important asset? There are even areas of other applications, such as an insurance aspect or some kind of ownership-proof.

Gardner: It sounds to me like we are adding a lot of metadata to a business process. What's different when you apply that through Blockchain than if you were doing it through a platform?

Inherit the trust

Fox: That's a great question. Blockchain is like the cloud in the sense that it's an innovation at the platform layer. But the chain is only as valuable as the external trust that it inherits. That external trust is the proof of what you have put on the chain digitally, and it includes the proof of who has taken it off and in what way they have control.

As we associate a chain transaction, or a posting to the ledger, with its original transactions within the SAP Ariba Network, we are actually adding a lot of provenance to that single Blockchain record. That's the real key: marrying the transactional world and the B2B world with this new trusted commerce capability that comes with Blockchain.

Gardner: Leanne, we have you here as a prime example of where Blockchain is being used outside of its original adoption. Tell us first about Everledger, and then what it was you saw in Blockchain that made you think it was applicable to a much wider business capability.

Kemp: Everledger is a fast-moving startup using the best of emerging technology to assist in the reduction of risk and fraud. We began in April of 2015, so it's actually our birthday this week. We started in the world of diamonds, where we apply Blockchain technology to bring transparency to a once-opaque market.


And what did I see in the technology? At the very core of cryptocurrency, they were solving the problem of double-spend. They were solving the problem of transfer of value, and we could translate those two very powerful concepts into the diamond industry.

At the heart of the diamond industry, beyond the physical object itself, is certification, and certificates in the diamond industry are the currency of trade. Diamonds are listed on websites around the world, and they are mostly sold on the merit of the certification. We were able to see the potential of the cryptocurrency, but we could decouple the currency from the ledger and then use that same mechanism as a way to transfer value, or transfer ownership or custody. And, of course, diamonds are a girl's best friend, so we might as well start there.
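Kemp's point about decoupling the currency from the ledger while keeping its double-spend protection can be made concrete with a toy custody ledger. This is a minimal sketch, not Everledger's actual system; the class, asset IDs, and method names are invented for illustration:

```python
# Toy illustration of the "double-spend" check applied to custody of a
# physical asset rather than currency. Real Blockchain systems distribute
# and cryptographically secure this ledger; here it is a single object.

class CustodyLedger:
    """Append-only record of who currently holds each asset."""

    def __init__(self):
        self.owner = {}      # asset_id -> current custodian
        self.history = []    # append-only transfer log

    def register(self, asset_id, custodian):
        if asset_id in self.owner:
            raise ValueError("asset already registered")
        self.owner[asset_id] = custodian
        self.history.append(("register", asset_id, custodian))

    def transfer(self, asset_id, sender, receiver):
        # The ledger rejects a transfer unless the sender actually holds
        # the asset -- this is the double-spend check.
        if self.owner.get(asset_id) != sender:
            raise ValueError("sender does not hold this asset")
        self.owner[asset_id] = receiver
        self.history.append(("transfer", asset_id, sender, receiver))


ledger = CustodyLedger()
ledger.register("diamond-001", "miner")
ledger.transfer("diamond-001", "miner", "cutter")
# A second attempt by the miner to transfer the same stone now fails,
# because custody has already moved to the cutter.
```

The same check that stops a coin being spent twice stops a certificate, or the stone behind it, from being sold twice.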

Dealing with diamonds

Gardner: What was the problem in the diamond industry that you were solving? What was not possible that now is?

Kemp: The diamond industry boasts some pretty impressive numbers. First, it's been around for 130 years. Most of the relationships among buyers and sellers have survived generation upon generation based on a gentleman's handshake and trust.

The industry itself has been bound tightly with those relationships. As time has passed and generations have passed, what we are starting to see is a glacial melt. Some of the major players have sold off entities into other regions, and now that gentleman's handshake needs to be transposed into an electronic form.

Some of the major players in the market, of course, still reside today. But most of the data under their control sits in a siloed environment. Even the machines that are on the pipeline that help provide identity to the physical object are also black-boxed in terms of data.

We are able to bring a business network to an existing market. It's global. Some 81 countries around the world trade in rough diamonds. And, of course, the value of the diamonds increases as they pass through their evolutionary chain. We are able to bring an aggregated set of data. Not only that, we transpose the human element of trust -- the gentleman's handshake, the chit of paper, and the promise to pay that has largely existed and has built 130 years of trade.

We are now able to transpose that into a set of electronic-form technologies -- Blockchain, smart contracts, cryptography, machine vision -- and we are able to take forward a technology platform that will see transactional trust being embedded well beyond my lifetime -- for generations to come.

Gardner: Joe, we have just heard how this is a problem-solution value in the diamond industry. But SAP Ariba has its eyes on many industries. What is it about the way things are done now in general business that isn't good enough but that Blockchain can help improve?

Fox: As we have spent years at Ariba solving procurement problems, we identified some of the toughest. When I saw Everledger, it occurred to me that they may have cracked the nut on one of the toughest areas of B2B trade -- and that is true understanding, visibility, and control of asset movement.

It dawned on me, too, that if you can track and trace diamonds, you can track and trace anything. I really felt like we could team up with this young company and leverage the unique way they figured out how to track and trace diamonds and apply that across a huge procurement problem. And that is, how do a supplier and a buyer manage the movement of any asset after they have purchased it? How do we actually associate that movement of the asset back to its original transactions that approved the commit-to-pay? How do you associate a digital purchase order (PO) with a digital movement of the asset, and then to the actual physical asset? That's what we really are teaming up to do.

That receipt of the asset has been a dark space in the B2B world for a long time. Sure, you can get a shipping notice, but most businesses don't do goods receipts. And as the asset flows through the supply chain -- especially the more expensive the item is -- that lack of visibility and control causes significant problems. Maybe the most important one is: overpaying for inventory to cover actual lost supply chain items in transit.

I talked to a really large UK-based telecom company, and they told me that with what we are going to do with Everledger, applied just to their fiber optics, they could cut their buying in half. Why? Because they overbuy fiber optics to make sure they are never short on fiber optic inventory.

That precision of buying and delivery applies across the board to all merchants and all supply chains, even middle of the supply chain manufacturers. Whenever you have disruption to your inbound supply, that’s going to disrupt your profitability.

Gardner: It sounds as if what we are really doing here is getting a highly capable means -- that’s highly extensible -- to remove the margin of error from the tracking of goods, from cradle to grave.

Chain transactions

Fox: That’s exactly right. And the Internet is the enabler, because Blockchain is everywhere. Now, as the asset moves, you have the really cool stuff that Everledger has done, and other things we are going to do together – and that’s going to allow anybody from anywhere to post to the chain the asset receipt and asset movement.

For example, with a large container coming from overseas, you will have the chain record of every place that container has been. If it doesn't show up at a dock, you now have visibility as the buyer that there is a supply chain disruption. That chain being out on the Internet, at a layer that’s accessible by everyone, is one of the keys to this technology.

We are going to be focusing on connecting the fabric of the chain together with Hyperledger. Everledger builds on the Hyperledger platform. The fabric that we are going to tie into is going to directly connect those block posts back to the original transactions, like the purchase order, the invoice, the ship notice. Then the companies can see not only where their asset is, but also view it in context of the transactions that resulted in the shipment.
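The chain-of-custody idea Fox describes -- posts that link back to the purchase order and that anyone can verify -- can be sketched as a simple hash-chained record. This is an illustrative toy, not Hyperledger itself; the record fields, events, and PO number are all invented:

```python
import hashlib
import json

# Each hypothetical ledger entry references the originating business
# document (here, a PO number) and the hash of the previous entry, so any
# tampering with an earlier record breaks every hash that follows it.

FIELDS = ("prev", "event", "po", "location")

def make_entry(prev_hash, event, po_number, location):
    body = {"prev": prev_hash, "event": event,
            "po": po_number, "location": location}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify(chain):
    """Recompute every hash and check each prev-link."""
    prev = "GENESIS"
    for entry in chain:
        if entry["prev"] != prev:
            return False
        expected = hashlib.sha256(json.dumps(
            {k: entry[k] for k in FIELDS},
            sort_keys=True).encode()).hexdigest()
        if entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

# Track a container across three checkpoints against one PO.
chain, prev = [], "GENESIS"
for event, loc in [("shipped", "Shenzhen"),
                   ("port-arrival", "Rotterdam"),
                   ("received", "London")]:
    entry = make_entry(prev, event, "PO-4711", loc)
    chain.append(entry)
    prev = entry["hash"]

assert verify(chain)
chain[1]["location"] = "elsewhere"   # tamper with one record...
assert not verify(chain)             # ...and verification fails
```

Because every entry carries the PO reference, a buyer can see both where the asset is and which transaction committed the payment for it.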

Gardner: So the old adage -- trust but verify -- we can now put that to work and truly verify. There's news taking place here at SAP Ariba LIVE between Everledger and SAP Ariba. Tell us about that, and how the two companies -- one quite small, one very large -- are going to work together.

Fox: Ariba is all-in on transforming the procurement industry, the procurement space, the processes of procurement for our customers, buyers and sellers, and we are going to partner heavily with key players like Everledger.

Part of the announcement is this partnership with Everledger around track and trace, but it is not limited to track and trace. We will leverage what they have learned across our platform of $1 trillion a year in spend, with 2.5 million companies trading assets with each other. We are going to apply this partnership to many other capabilities within that.

Kemp: I am very excited. It's a moment in time that I think I will remember for years to come. In March we also made an important announcement with IBM on some of the work that we have done beyond identifying objects. And that is to take the next step around ensuring that we have an ethical trade platform, meaning one that is grounded in cognitive compliance.

We will be able to identify the asset, but also know, for example in the diamond industry, that a diamond has passed through the right channels, paid the dutiful taxes that are due as a part of an international trade platform, and ensure all compliance is hardened within the chain.

I am hugely excited about the opportunity that sits before me. I am sincerely grateful that such a young company has been afforded the opportunity to really show how we are going to shine.

Gardner: When it comes to open trade, removing friction from commerce, these have been goals for hundreds of years. But we really seem to be onto something that can make this highly scalable, very rich -- almost an unlimited amount of data applied to any asset, connected to a ledger that’s a fluid, movable, yet tangible resource.

Fox: That’s right.

Gardner: So where do we go next, Joe? If the sky is the limit, describe the sky for me? How big is this, and where can you take it beyond individual industries? It sounds like there is more potential here.

Reduced friction costs

Fox: There is a lot of potential. If you think about it, Blockchain is an evolution of the Internet; we are going to be able to take advantage of that.

The new evolution is that it's a structured capability across the Internet itself. It’s going to be open, and it’s going to be able to allow companies to ledger their interactions with each other. They are going to be able, in an immutable way, to track who owns which asset, where the assets are, and be able to then use that as an audit capability.

That's all very important to businesses, and until now the Internet itself has not really had a structure for business. It's been open, the Wild West. This structure for business is going to help with what I call trusted commerce because in the end businesses establish relationships because they want to do business with each other, not based on what technology they have.

Another key fact about Blockchain is that it’s going to reduce friction in global B2B. I always like to say if you just accelerated B2B payments by a few days globally, you would open up Gross Domestic Product (GDP), and economies would start growing dramatically. This friction around assets has a direct tie to how slowly money moves around the globe, and the overall cost and friction from that.

So how big could it go? Well, I think that we are going to innovate together with Everledger and other partners using the Hyperledger framework. We are going to add every buyer and seller on the Ariba Network onto the chain. They are just going to get it as part of our platform.

Then we are going to begin ledgering all the transactions that they think make sense between themselves. We are going to release a couple of key functions, such as smart contracts, so their contract business rules can be applicable in the flow of commerce -- at the time commerce is happening, not locked up in some contract, or in some drawer or Portable Document Format (PDF) file. We are going to start with those things.
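A hedged sketch of what "contract business rules in the flow of commerce" could look like: the negotiated terms live as data, and each purchase order is checked against them at the moment commerce happens, rather than sitting in a PDF in a drawer. The field names and thresholds below are invented; real smart contracts on a platform like Hyperledger are considerably more involved:

```python
# Illustrative contract terms, expressed as data instead of a PDF.
CONTRACT_TERMS = {
    "unit_price_max": 12.50,   # agreed ceiling per unit
    "max_order_qty": 10_000,   # per-PO quantity cap
    "currency": "USD",
}

def validate_po(po, terms=CONTRACT_TERMS):
    """Check a purchase order against the contracted terms at the moment
    it is placed. Returns a list of violations (empty list == clean PO)."""
    violations = []
    if po["currency"] != terms["currency"]:
        violations.append("wrong currency")
    if po["unit_price"] > terms["unit_price_max"]:
        violations.append("price above contracted ceiling")
    if po["quantity"] > terms["max_order_qty"]:
        violations.append("quantity above contracted cap")
    return violations

good_po = {"currency": "USD", "unit_price": 11.00, "quantity": 5000}
bad_po  = {"currency": "USD", "unit_price": 14.00, "quantity": 5000}
assert validate_po(good_po) == []
assert validate_po(bad_po) == ["price above contracted ceiling"]
```

The point is where the check runs: in the transaction flow, automatically, instead of in an after-the-fact audit of a paper contract.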

I don't know what applications we are going to build beyond that, but that's the excitement of it. I think the fact that we don't know is the big play.

Gardner: From a business person's perspective, they probably don't care too much that it's Blockchain enabling this, just like a lot of people didn't care 20 years ago that it was the Internet allowing them to shop online or send emails to anybody anywhere. Setting the technology aside, what's the business benefit that people should be thinking about?

Fox: Everybody wants digital trust, right? Leanne, why don’t you share some of the things you guys have been exploring?

Making the opaque transparent

Kemp: In the diamond industry, there is fraud related to document tampering. Typically paper certificates exist across the backbone, so it’s very easy to be able to transpose those into a PDF and make appropriate changes for self-gain.

Double-financing of the pipeline is a very real problem; with invoicing and, of course, accounts receivable, parties have the ability to have banks finance the same invoices two, three, four times.

We have issues with round-tripping of diamonds through countries, where transfer pricing isn't declared correctly, along with the avoidance of tax and duties.

All of these issues are the dark side of the market. But, now we have the ability to bring transparency around any object, particularly in diamonds -- the one commodity that’s yet to have true financial products wrapped around it. Now, what do I mean by that? It doesn’t have a futures market yet. It doesn’t have exchange traded funds (ETFs), but the performance of diamonds has outperformed gold, platinum and palladium.

Now, what does this mean? It means we can bring transparency to the once opaque, have the ability to know if an object has gone through an ethical chain, and then realize the true value of that asset. This process allows us to start and think about how new financial products can be formed around these assets.

We are hugely interested in rising asset classes beyond just the commodity section of the market. This platform shift is like going from the World Wide Web to the World Wide Ledger. Joe was absolutely correct when he mentioned that the Internet hasn't been woven for transactional trust -- but we have the ability to do this now.

So from a business perspective, you can begin to really innovate on top of this exponential set of technology stacks. A lot of companies quote Everledger as a Blockchain company. I have to correct them and I say that we are an emerging technology company. We use the very best of Blockchain and smart contracts, machine vision, sensorial data points, for us to be able to form the identity of objects.

Now, why is that important? Most financial services companies have really been focused on Know Your Customer (KYC), but we believe that it's Know Your Object (KYO) that really creates an entirely new context around it.

Now, that transformation and the relationship of the object have already started to move. When you think about the Internet of Things (IoT), mobile phones, and autonomous cars -- these are largely devices connected to the fabric of the web. But are they connected to the fabric of the transactions and the identity around those objects?

Insurance companies have begun to understand this. My work in the last 10 years has been deeply involved in insurance. As you begin to build and understand the chain of trust and the chain of risk, then tectonic plate shifts in financial services begin to unfold.

Apps and assets, on and off the chain

Fox: It's not just about the chain, it's about the apps we build on top -- and it's really about the value to the buyer and the seller as we build those apps.

To Leanne's point, it's first going to be about the object. The funny thing is, we have struggled to provide visibility and control of an object in a digital way, and this is going to fix that. In the end, B2B, which is where SAP Ariba is, is about somebody getting something and paying for it. And that physical asset that they are getting is being paid for with another asset. They are just two different forms. By digitizing both and keeping them in a ledger that really cannot be altered -- it will be the truth, but open to everyone, buyers and sellers.

Businesses will have to invent ways to control how frictionless this is going to be. I will give you a perfect example. In the past if I told you I could do an international payment of $1 million to somebody in two minutes, you would have told me I was crazy. With Blockchain, one corporation can pay another corporation $1 million in two minutes, internationally.

And on the chain, companies like Everledger can build capabilities that do the currency translation on the fly, as it's passing through, and that doesn't disintermediate the banks -- because how did the $1 million get onto the chain in the first place? Someone put it on the chain through a bank. The bank is backing that digital version. How does it get off the chain so you can actually do something with it? It goes through another bank. It's actually going to make the banks more important. Again, Blockchain is only as good as the external trust that it inherits.

I really think we have to focus on getting the chain out there and really building these applications on top.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

You may also be interested in:

Inside story of building a global security operations center for cyber defense

The next BriefingsDirect inside story examination of security best practices focuses on the building of a global security operations center (SOC) for cyber defense. 

Learn here how Zayo Group in Boulder, Colorado built a state-of-the-art SOC as it expanded its international managed security service provider practice.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript.

Hear directly from Mike Vamvakaris, Vice President of Managed Cyber Security at Zayo Group, on the build-out, best practices, and end-results from this impressive project. The moderator is Serge Bertini, Vice President of Sales and General Manager of the Canada Security Division at Hewlett Packard Enterprise (HPE).

Serge Bertini: Mike, you and I have talked many times about the importance of managed security service providers (MSSPs), global SOCs, but for our readers, I want to take them back on the journey that you and I went through to get into the SOC business, and what it took from you to build this up.

So if you could, please describe Zayo’s business and what made you decide to jump into the MSSP field.

Mike Vamvakaris: Thanks for the opportunity. Zayo Group is a global communications and infrastructure provider. We serve more than 365 markets. We have 61 international data centers on-net, off-net, and more than 3,000 employees.


Zayo Canada required a SOC to serve a large government client that required really strict compliance, encryption, and correlational analysis.

Upon further expansion, the SOC we built in Canada became a global SOC, and now it can serve international customers as well. Inside the SOC, you will find things such as US Federal Information Processing Standard (FIPS) 140-2 security standards compliance. We do threat hunting, threat intelligence. We are also doing machine learning, all in a protected facility -- a five-zone SOC.

This facility was not easy to build; it was a journey, as we have talked about many times in person, Serge.

Holistic Security

Bertini: What you guys have built is a state-of-the-art facility. I am seeing how it helps you attract more customers, because not only do you have critical infrastructure in your MSSP, but also you can attract customers whose stringent security and privacy concerns can be met.

Vamvakaris: Zayo is in a unique position now. We have grown the brand aggressively through organic and inorganic activities, and we are able to offer holistic and end-to-end security services to our customers, both via connectivity and non-connectivity.

For example, within our facility, we will have multiple firewalling and distributed denial-of-service (DDoS) technologies -- now all being protected and correlated by our state-of-the-art SOC, as you described. So this is a really exciting and new opportunity that began more than two years ago with what you at HPE have done for us. Now we have the opportunity to turn and pivot what we built here and take that out globally.

Bertini: What made you decide on HPE ArcSight, and what did you see in ArcSight that was able to meet your long-term vision and requirements?

Turnkey Solutions

Vamvakaris: That’s a good question. It wasn’t an easy decision. We have talked about this openly and candidly. We did a lot of benchmarking exercises, and obviously selected HPE ArcSight in the end. We looked at everyone, without going into detail. Your listeners will know who they are.

But we needed something that supported multi-tenancy -- the single pane of glass view. We are serving multiple customers all over the world, and ArcSight allowed us to scale without applying a tremendous amount of capital expenditure (CAPEX) investment and ongoing operational expenditure (OPEX) to support the infrastructure and the resources inside the SOC. It was key for me on the business side that the business case was well supported.

We had a very strict industry regulation in working with a large government customer, to be FIPS-compliant. So out of the box, a lot of the vendors that we were looking at didn’t even meet those requirements.

Another thing I really liked about ArcSight when we did our benchmarking is the event log filtration. There really wasn't anyone else that could do the filtration at the throughput and capacity we needed, so that lent itself very well. Making sure that you are getting the salient events, and filtering out the noncritical alerts so you can focus on what really needs attention, was key for us.
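As a generic illustration of event-log filtration (not ArcSight's actual engine), the idea is to drop high-volume, low-value events before correlation so analysts see only the salient ones. The fields, severities, and thresholds below are invented:

```python
# Toy event stream: a mix of routine noise and security-relevant events.
RAW_EVENTS = [
    {"severity": 2, "type": "heartbeat"},
    {"severity": 7, "type": "auth-failure"},
    {"severity": 3, "type": "dns-query"},
    {"severity": 9, "type": "data-exfil-suspected"},
]

def salient(events, min_severity=5, drop_types=("heartbeat", "dns-query")):
    """Keep only events above a severity floor that are not on the
    known-noise list -- the filtration step before correlation."""
    return [e for e in events
            if e["severity"] >= min_severity and e["type"] not in drop_types]

alerts = salient(RAW_EVENTS)
assert [e["type"] for e in alerts] == ["auth-failure", "data-exfil-suspected"]
```

At production throughput the same principle applies; the engineering challenge is doing this fast enough on millions of events per second, which is the capacity point Vamvakaris is making.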

Something that you and I have talked about is the strategic information and operations center (SIOC) service. As a company that knew we needed to build out a SOC, to protect our own backbone, and offer those services to our extended connectivity customers, we enlisted SIOC services very early to help us with everything from incident response management, to building up the Wiki, even hiring and helping us retain critical skill sets in the SOC.

From an end-to-end perspective, this is why we went with ArcSight and HPE. They offered us a turnkey solution, to really get us something that was running.

The Trifecta: People, Process, Technology

Bertini: In this market, what a lot of our customers see is that their biggest challenge is people. It takes a lot of people to set up and run an MSSP. The investment that you made is the big differentiator, because it's not just the technology, it's the people and process. When I look at the need in this market, there is a lack of talented people.


How did you build your process and the people? What did you have to do yourself to build the strength of your bench? Later on we can talk a little bit more about Zayo and how HPE can help put all of this together.

Vamvakaris: We were the single tenant, if you will. Ultimately we needed to go international very quickly. So we went from humble beginnings to an international capability. It’s a great story.

For us, you hit the nail on the head. In a SOC, the technology obviously is pertinent; you have to understand your use cases and the policies you are using to protect your customers. We needed something very modular, and ArcSight worked for that.

But within the SOC, our customers require things like customized reporting and even customized incident-response plans that are tailored to meet their unique audits or industry regulations. It's people, process, and tools or technology, as they say. I mean, that is the lifeline of your SOC.

One of the things we realized early on is that you have to focus on everything from your triage, to incident response, to your kill-chain processes. This is something we have invested in significantly, and this is where we believe we actually add a lot of value to our customers.

Bertini: So it’s not just a logging capability, you guys went way beyond providing just the eyes on the glass to the red team and the tiger team and everything else in between.

Vamvakaris: Let me give you an example. Within the SOC, we have SOC Level 1, all the way to Level 3, and then we have threat hunting. So inside we do threat intelligence. We are now using machine-learning technologies. We have threat hunting, predictive analytics, and we are moving into user behavior analysis.

Remember the way I talked about SOC Level 1, Level 2, Level 3 -- this is a 24x7, 365-day facility. This is a five-zone SOC for enhanced access control, with mantraps inside and two-factor biometric access control. It's a facility that we are very proud of and that we love showcasing.

Bertini: You are a very modest person, but in the span of two years you have done a lot. You started with probably one of the largest mammoth customers, but one thing that you didn’t really talk about is, you are also drinking your own champagne.

Tell us a little bit more about Zayo. It's a large corporation, diverse and global. Tell us about the integration of Zayo into your own SOC, too.

Drinking your own Champagne

Vamvakaris: Customers always ask us about this. We have all kinds of fiber and Ethernet customers -- large superhighway customers, I call them, with massive data connectivity -- and Zayo is well-known in the industry for that; obviously one of the leaders.

The interesting part is that we are able to turn and pivot, not only to our customers, but we are also now securing our own assets -- not just the enterprise, but on the backbone.

So you are right, we sip our own champagne. We protect our customers from threats and unauthorized data exfiltration, and we also do that for ourselves. So we are talking about a global multinational backbone environment.

Bertini: That’s pretty neat. What sort of threats are you starting to see in the market and how are you preventing those attacks, or at least how can you be aware in advance of what is coming down the pipe?

Vamvakaris: It’s a perpetual problem. We are invested in what’s called an ethical hacking team, which is the whole white hat/black hat piece.

In practice, we’re trying to -- I won’t say break into networks, but certainly testing the policies, the cyber frameworks that companies think they have, and we go out of our way to make sure that that is actually the case, and we will go back and do an analysis for them.

So where do I see the market going? Well, we see a lot of ransomware; we see a lot of targeted spear phishing. Things are just getting worse, and I always talk about how this is no longer an IT issue, but it’s a business problem.

People now are using very crafty organizational and behavioral tactics to acquire identities and map them back to individuals in a company. They can achieve targeted data exfiltration by fooling or tricking users into giving up passwords or access, or signing all types of waivers. You hear about this every day: somewhere, someone accidentally clicked on something, and the next thing you know they have wired money across the world to someone.

So we actually see things like that. Obviously we’re very private in terms of where we see them and how we see them, but we protect against those types of scenarios.

Gone are the days where companies are just worried about their customer provided equipment or even cloud firewalls. The analogy I say, Serge, is if you don’t know who is knocking at the door, how are you going to protect yourself, right?

You need to be able to understand who is out there, what they are trying to do, to be able to mitigate that. That’s why I talk about threat hunting and threat intelligence.

Partners in Avoiding Crime

Bertini: I couldn't agree more with you. To me, the partnership that we built between Zayo and HPE is a testament to how the business needs to evolve. What we have done is pretty unique in this market, and we truly act as a partner; it's not a vendor-relationship type of situation.

Can you describe how our SIOC was able to help you get to the next level, because it’s about time-to-market, at the end of the day. Talk about best practices that you have learned, and what you have implemented.

Vamvakaris: We grew out to be an international SOC, and that practice began with one large request for proposal (RFP) customer. So we had a compressed time-to-market. We needed to be up and running, fully turnkey -- everything.

When we began this journey, we knew we couldn't do it ourselves. We selected the technology, we benchmarked it, and we consulted the Gartner Magic Quadrant. We were always impressed that HPE ArcSight has been in that Magic Quadrant for years, if not a decade.

But what really stood out is the HPE SIOC.

We enlisted the SIOC services, essentially the consulting arm of HPE, to help us build out our world-class multizone SOC. That really did help us get to market. In this case, we would have been paying penalties if we weren’t up and running. That did not happen.

The SIOC came in and assessed everything that we talked about earlier; they stress-tested our triage model and incident response plan. They helped us on the kill chain; they helped us with the Wiki. What was really nice and refreshing was that they helped us find talent where our SOC is located. That for me was critical. Frankly, that was a differentiator. No one else was offering those types of services.

Bertini: How is all of this benefitting you at the end of the day? And where do you see the growth in your business coming for the next few years?

Ahead in the Cloud

Vamvakaris: We could not have done this on our own. We are fortunate enough that we have learned so much now in-house.

But we are living in an interconnected world. Like it or not, we are about to automate that world with the Internet of things (IoT), and always-on mobile technologies, and everyone talks about pushing things to the cloud.

The opportunity for us is exciting. I believe in a complete, free, open digital world, which means we are going to need -- for a long time -- to protect the companies as they move their assets to the cloud, and as they continue to do mobile workforce strategies -- and we are excited about that. We get to be a partner in this ecosystem of a new digital era. I think we are just getting started.

The timing, then, is perfect, it’s exciting, and I think we are going to see a lot of explosive growth. We have already started to see that, and now I think it’s just going to get more and more exciting as we go on.

Bertini: You have talked about automation, artificial intelligence (AI), and machine learning. How are those helping you to optimize your operations and then ultimately benefitting you financially?

Vamvakaris: As anyone out there who has built a SOC knows, you’re only as good as your people, processes, and tools. So we have our tools, we have our processes -- but the people, that cyber security talent, is not cheap. The SOC analysts have a tough job. So the more we can automate, and the more we can give them help, the better. A big push now is for AI, which really is machine learning: automating and creating a baseline from which you can build a pattern, if you will, of repeatable incidents, and then understanding all of that ahead of time.
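The baselining idea described here can be sketched roughly as follows. This is a hypothetical illustration only -- the event types, counts, and sigma threshold are invented for the example, not HPE ArcSight’s actual model:

```python
# Learn a per-event-type baseline from historical hourly counts, then flag
# observed counts that deviate far above that baseline.
from statistics import mean, stdev

def build_baseline(history):
    """history: {event_type: [hourly counts]} -> {event_type: (mean, stdev)}"""
    baseline = {}
    for event_type, counts in history.items():
        sd = stdev(counts) if len(counts) > 1 else 0.0
        baseline[event_type] = (mean(counts), sd)
    return baseline

def flag_anomalies(baseline, observed, sigma=3.0):
    """Return event types whose observed count exceeds mean + sigma * stdev."""
    flagged = []
    for event_type, count in observed.items():
        mu, sd = baseline.get(event_type, (0.0, 0.0))
        # Floor the stdev at 1.0 so a flat history doesn't create a zero-width band.
        if count > mu + sigma * max(sd, 1.0):
            flagged.append(event_type)
    return flagged

history = {"failed_login": [4, 6, 5, 7, 5], "port_scan": [1, 0, 2, 1, 1]}
baseline = build_baseline(history)
print(flag_anomalies(baseline, {"failed_login": 40, "port_scan": 1}))
```

A repeatable incident that stays inside its learned band never reaches an analyst; only the outliers do, which is the OPEX savings being described.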

We are working with that technology. Obviously, HPE ArcSight is the engine of the SOC for correlation analysis -- ArcSight Enterprise Security Manager (ESM) specifically -- but outside there are peripherals that tie into it.

It’s not just about having the human capabilities, but it's also augmenting them with the right technologies and tools so they can respond faster, they can get to the issues; they can do a kill chain process quickly. From an OPEX perspective, we can free up the Level 1 and Level 2 talent and move them into the forensic space. That’s really the vision of Zayo.

We are working with technologies including HPE ArcSight to plug into that engine that actually helps us free up the incident-response and move that into forensics. The proactive threat hunting and threat intelligence -- that’s where I see the future for us, and that’s where we’re going.

Bertini: Amazing. Mike, with what you have learned over the last few years, if you had to do this all over again, what would you do differently?

Practice makes perfect

Vamvakaris: I would beg for more time, but I can’t do that. It was tough, it was tough. There were days when we didn’t think we were going to make it. We are very proud and we love showcasing what we built -- it’s an amazing, world-class facility.

But what would I do differently? We probably spent too much time second-guessing ourselves, trying to get everything perfect. Yet it’s never going to be perfect. A SOC is a living, breathing thing -- it's all about the people inside and the processes they use. The technologies work, and getting the right technology, and understanding your use cases and what you are trying to achieve, is key. I would focus on not trying to make it perfect, just getting it out there, and then being more flexible in making corrections.

In our case, because it was a large government customer with regulations we had to meet, we built that capability properly from the ground up the first time -- and as painful as that was, we can now learn from it.

In hindsight, did we have to have everything perfect? Probably not. But looking back at the compressed schedule, and being audited every quarter, that capability has nonetheless put us in a better place for the future.

Bertini: Mike, kudos to you and your team. I have worked with your team for the last two to three years, and what you have done is nothing short of a miracle. What you built is a top-class MSSP, meeting some of the most stringent requirements from the government, and it shows.

Now, when you guys present to a customer, and when we do joint calls with customers, we are an extension of each other. We at HPE are just feeding you the technology, but how you have implemented it and built it out with your people, processes, and technology -- it’s fantastic.

So with that, I really thank you. I'm looking forward to the next few years together, to being successful, and bringing all our customers under your roof.

Vamvakaris: This is the partnership that we talked about. I think that’s probably the most important thing. If you do endeavor to do this, you really do need to bring a partner to the table. HPE helped us scale globally, with cost savings and an accelerated launch. That actually can happen with a world-class partnership. So I also look forward to working with you, and serving both of our customer bases, and bringing this great capability out into the market.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Diversity spend: When doing good leads to doing well

The next BriefingsDirect digital business thought leadership panel discussion focuses on the latest path to gaining improved diversity across inclusive supply chains.

The panel examines why companies are seeking to improve supplier diversity, the business and societal benefits, and the new tools and technologies that are making attaining inclusive suppliers easier than ever.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about the increasingly data-driven path to identifying and achieving the workforce that meets all requirements, please welcome Rod Robinson, Founder and CEO of ConnXus; Jon Stevens, Global Senior Vice President of B2B Commerce and Payments at SAP Ariba, and Quentin McCorvey, Sr., President of M and R Distribution Services.

The panel was assembled and recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas. The discussion was moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Jon, why is it important to seek diversity in procurement and across supply chains? What are the reasons for doing this?

Stevens: It’s a very good question. It's for a few reasons. Number one, there is a global war for talent, and when you can get a diverse point of view, when you can include multiple different perspectives, that usually helps drive several other benefits, one of which could even be innovation.

We often see companies investing deeply inside their supply chain, working with a diverse set of suppliers, and they are gaining huge rewards from an innovation standpoint. When you look at the leading companies that leverage their suppliers to help drive new product innovation, it usually comes from these areas.

We also see companies more focused on longer-term relationships with their suppliers. Having a diverse perspective -- and having a set of diverse suppliers -- helps with those longer-term relationships, as both companies continue to grow in the process.

Gardner: Rod, what are you seeing in the marketplace as the major trends and drivers that have more businesses seeking more inclusivity and diversity in their suppliers?

Diversity benefits business

Robinson: As a former chief procurement officer (CPO), the one thing that I can definitely say that I have witnessed is that more diverse and inclusive supply chains are more innovative and deliver high value.

I recently wrote a blog where I highlighted some statistics that I think every procurement professional should know. One is that 99.9% of all US firms fall into the small business category. Women- and minority-owned businesses represent more than 50% of that total, and small businesses are responsible for employing around 140 million people.

Robinson

This represents a significant portion of the workforce. As we all know, small businesses really are the engine of the economy -- small businesses are responsible for 65% of net new jobs.

At the end of the day, women and minorities represent more than 50% of all businesses, but they only represent about 6% of the total revenue generated.

The only thing that I would add is that diversity is vitally important as an economic driver for our economy.

Gardner: Rod points out a rich new wellspring of skills, talent and energy coming up organically from the small to medium-sized businesses. On the other hand, major national and international brands are demanding more inclusivity and diversity from their suppliers. If you are in the middle of that supply chain, is this something that should interest you?

Targeting talent worldwide

Stevens: You are spot-on. We definitely see our leading customers looking across that landscape, whether they are a large- or medium-sized company. The war for talent is only going to increase. Companies will need to seek even more diverse sources of talent. They are really going to have to stretch themselves to look outside the walls of their country to find talent, whereas other companies may not be doing so. So you're going to see rising diversity programs.

Stevens

We have several customers in emerging parts of the world; let's take South Africa for example. I spend a lot of time in South Africa, and one of our customers there, Nedbank, invests a lot of time and a lot of money in the growth and development of the small businesses. In South Africa, the statistics that Rod talked about are even greater as far as the portion of small companies. So we are seeing that trend grow even faster outside of the US, and it's definitely going to continue.

Gardner: Rod, you mentioned that there are statistics, studies, and research out there indicating that this isn't just a requirement, it's really good business. I think McKinsey came out with a study, too, that found companies in the top quartile for gender, racial, and ethnic diversity were more likely to have better financial returns. So this isn't just the right thing to do, but it's also apparently demonstrated to be good business, too. Do you have any other insights into why the business case for this is so strong?

Diversity delivers innovation

Robinson: Speaking from first-hand experience, having been responsible for procurement and supplier diversity within a large company, there were many drivers. We had federal contracts that required us to commit to a certain level of engagement (and spending) with diverse suppliers. We had to report those stats and our progress on a monthly and/or quarterly basis. It was interesting that while we were bound by these contractual mandates -- not only from the government but also from customers like Procter and Gamble, Macy's, and others -- we started to realize that this was really creating more competition within the categories that we were taking to market. It was bringing value to the organization.

We had situations where we were subcontracting to diverse suppliers that were providing us with access to markets that we didn't even realize that we were missing. So again, to Jon's point, it's more than just checking a box. We began to realize that this is really a market-imperative. This is something that is creating value for the organization.

The whole concept of supplier diversity started with the US government back in the late ’60s and early ’70s. That was the catalyst, but companies realized that it was delivering significant value to the organization, and it's helped to introduce new, innovative companies across the supply chain.

At ConnXus, our big break came when McDonald's gave us an opportunity five years ago. They took a chance on us when we were a start-up company of four.  We are now a company of 25. Obviously, revenues have grown significantly and we've been able to attract partners like SAP Ariba. That's the way it should work. You always want to look for opportunities to identify new, innovative suppliers to introduce into a supply chain; otherwise we get stagnant.

Small but mighty

Stevens: I'll add to what Rod said. This is just the sort of feedback we hear from our customers, the fact that a lot of the companies that are in this inclusive space are small -- and we think that's a big advantage.

Speed, quickness and flexibility are something you often see from diverse suppliers, or certainly smaller businesses, so a company that can have that in its portfolio has better responsiveness to their customer needs, versus a supply chain with very large processes or large organizations where it takes a while to respond to market needs. The quick in today's world will be far more successful, and having a diverse set of suppliers allows you to respond incredibly quickly. There is obviously a financial benefit in doing so.

Gardner: A big item of conversation here at SAP Ariba LIVE is how to reduce risk across your supply chain. Just like any economic activity, if you have a diversified portfolio, with different sizes of companies, different geographic locations, and different workforce components -- that can be a real advantage.

Now that we've established that there is a strong business case and rationale for seeking diversity, why do procurement professionals have trouble finding that diversity? Let's go to Quentin. What's holding back procurement professionals from finding the companies that they want?

McCorvey: Probably the biggest challenge is that the whole trend of supply chain optimization -- of driving cost out of the supply chain -- seems to be at odds with being inclusive, responsive, and bringing in your own diverse suppliers. A company may have had 20 to 30 suppliers of a product, and then they look to drive that down to just one or two suppliers. They negotiate prices for three-year contracts. That tends to weed out some of the smaller, more diverse organizations for several reasons.

McCorvey

For example, Rod talked about McDonald’s taking a chance on him. Well, they took a chance on him being a four-person organization; if he had to [grow first] he never would have had the opportunity.

For a company that requires a product in the market for every location nationally -- as opposed to regionally -- at a certain price, that tends to challenge a lot of the inclusion or the diversity in the supply chain.

Gardner: Right. Some companies have rules in place that don't provide the flexibility to attract a richer supplier environment. What is being done from your perspective at SAP Ariba, Jon, to go after such a calcification of rules that leads to somewhat limited thinking in terms of where they can find choices?

Power through partnerships

Stevens: That short-term thinking that Quentin talked about is absolutely one of the big barriers, and that generally comes down to metrics. What are they trying to measure? What are they trying to accomplish?

The more thought-leading companies are able to look past something in the first year or two, and focus on not just driving cost out, as Quentin talked about, but discovering what else their suppliers can help with, whether it’s something from a regulatory standpoint or something from a product and innovation perspective.

Certainly, one challenge is that short-term thinking, the other is access to information. We see far too many procurement organizations that just aren't thinking on a broader scale, whether it's a diverse scale or a global scale. What SAP Ariba is now bringing to the table with our solutions is being able to include information about where to find diverse suppliers, where to search and locate suppliers, and we do that through many partnerships.

We have a solution in South Africa called Tradeworld, which addresses this very topic for that market. We have a solution called SAP Ariba Spot Buy, which allows us to bring diverse suppliers automatically into a catalog for procurement organizations to leverage. And at SAP Ariba LIVE 2017 we announced that we are partnering with Rod and his firm, ConnXus, to expand the diversity marketplace by linking the ConnXus database and the SAP Ariba Network, which opens the door to more opportunities for all of our customers.

Robinson: If I could add to Jon’s point, one thing I also look forward to as a part of our partnership with SAP Ariba is thought leadership. There are opportunities for us to share best practices. We know the companies that are doing it really well, and we know the companies that may be struggling with it, and within our joint customer portfolios we will be able to share some of those best practices.

For example, there may be situations where a company is doing a big maintenance, repair and operations (MRO) bid and you have some large players involved, such as W.W. Grainger. There may be opportunities to introduce Grainger to smaller suppliers that maybe provide fewer stock keeping units (SKUs) that they can leverage strategically across their accounts. I have been involved in a number of initiatives like that. Those are the types of insights that we will be able to bring to the table, and that really excites me about this partnership.

Gardner: Those insights, that data, and the ability to leverage a business network to automate and facilitate that all at scale is key. From what we are hearing here at SAP Ariba LIVE, leveraging that business network is essential. Rod, tell us about ConnXus. What’s being announced here?

Seek and ye shall find in the connected cloud

Robinson: ConnXus is a next-generation procurement platform that specializes in making corporate supply chains more inclusive, transparent, and compliant. As I mentioned, we serve several global companies, many of which we share relationships with SAP Ariba.  Our cloud-based platform makes it easy for companies to track, monitor, and report against their supplier diversity objectives.

One of the major features is our supplier database, which provides real-time searchable access to nearly two million vetted women-, minority- and veteran-owned businesses across hundreds of categories. We integrate with the SAP Ariba Network. That makes it simple for companies to identify vetted, diverse suppliers. They can also search on various criteria including certifications, category, and geography. We have local, national and global capabilities.  SAP Ariba already is in a number of markets that we are looking to penetrate.
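The kind of criteria-based search described here can be sketched in a few lines. The field names, certification codes, and sample suppliers below are invented for illustration, not the actual ConnXus schema:

```python
# Toy supplier records: certifications (e.g., MBE = minority-owned,
# WBE = women-owned, VBE = veteran-owned), category, and region.
suppliers = [
    {"name": "Acme MRO", "certifications": {"MBE"}, "category": "MRO", "region": "US-OH"},
    {"name": "Delta Logistics", "certifications": {"WBE", "VBE"}, "category": "Logistics", "region": "US-NV"},
    {"name": "Global Parts", "certifications": set(), "category": "MRO", "region": "ZA"},
]

def search(suppliers, certifications=None, category=None, region=None):
    """Return names of suppliers matching every criterion that is supplied."""
    results = []
    for s in suppliers:
        if certifications and not certifications & s["certifications"]:
            continue  # must hold at least one of the requested certifications
        if category and s["category"] != category:
            continue
        if region and not s["region"].startswith(region):
            continue  # "US" matches "US-OH", "US-NV", etc.
        results.append(s["name"])
    return results

print(search(suppliers, certifications={"MBE", "WBE"}, category="MRO"))
```

Each unsupplied criterion is simply skipped, so the same function covers local, national, and global searches by widening or narrowing the `region` prefix.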

Gardner: I was really impressed when I looked at the ConnXus database, how rich and detailed it is, and not just ownership of companies but also the composition of those companies, where those people are located. So you would actually know where your inclusive supply chain is going to be, where the rubber hits the road on that, so to speak.

Jon, tell us about the news here on March 21, 2017, a marriage between SAP Ariba and ConnXus.

Stevens: The SAP Ariba Network has a community of over 2.5 million companies, and it’s companies like M and R Distribution Services that we have been able to help grow and foster over time, using some of the solutions I talked about and Ariba Discovery.

Adding to the information that Rod just talked about, we are greatly expanding that. We have the world’s largest, most global business network and now we have the world’s most diverse business network, due to the partnership with ConnXus being able to provide that information through various processes.

Fortune 2000 companies are constantly running requests for proposal (RFPs) and sourcing events, and analyzing supplier performance, on the SAP Ariba Network. The partnership with ConnXus will allow us to provide a lot more education and a lot more awareness to them.

For the suppliers that are on our network and those who will be joining us as a part of being in ConnXus, we expect to drive a lot more business.

Gardner: If I am a purchasing agent or a procurement officer and I want to improve my supplier inclusion program, how would something like, say, SAP Ariba Spot Buy using the ConnXus database, benefit me?

Stevens: As you decide to search for a category, we will return to you several things, one of which is now the diverse supplier list that ConnXus has. One of the things we are going to be doing with SAP Ariba Spot Buy is to have a section that highlights the diversity category so that it’s front and center for a purchasing agent to use and to take advantage of.

Gardner: Clearly there is strong value and benefit here if you are a procurement officer to get involved with the ConnXus database and Ariba Network. Quentin, at M and R Distribution Services, tell us from the perspective of a small supplier like yourself, what you're hearing about Ariba and ConnXus that interests you?

Be fruitful and multiply business opportunities

McCorvey: You referenced a marriage between SAP Ariba and ConnXus, and part of a marriage is to be fruitful and multiply. So I want them to be fruitful so I can multiply my business opportunities. What that does for a company like ours is, we are looking for opportunities. It’s tougher for me to compete as a small business against a Grainger, or against a Fastenal, or against other larger companies like that.

So when I am going after opportunities like that, it’s going to be tough for me to win those large-scale RFPs. But if there is a target spot opportunity that I am looking for or within a region, it’s something that I can begin to do if a company is looking for someone like me.

We’ve talked a lot about corporations and the benefits to corporations, but there is also a consumer benefit, because we are in an age where consumers are socially responsible and really want the companies they invest in or buy products from to have inclusive supply chains.

Folks are looking at that when they make their investment and consumer decisions. Every company has an extremely diverse consumer base, so why should they not have a diverse supplier base? When companies look at business ethics and corporate social responsibility as driving tools for their organization, I want them to be able to find me among the Fortune top 20 companies. The relationship that ConnXus and SAP Ariba are driving really catalyzes these opportunities for me.

Gardner: Rod, if a company like M and R Distribution Services is not yet in your database and they want to be, how might they get going on that process and become vetted and be available to a global environment like the Ariba Network?

Robinson: It’s really simple. One of the things that we have striven to provide is a fantastic, simple user experience. It takes about six minutes to complete the initial supplier profile. Any supplier can complete a profile at no cost.

Many suppliers actually get into our database because of the services that we already provide to large enterprise customers. So if you are a McDonald's supplier, for example, you are already going to be in our database because we scrub their vendor data on an annual basis. I think Quentin is already in because he happens to be a vendor of one of our customers, or of multiple customers.

There is a vetting process where we integrate with other third-parties to pull in data, and then you become discoverable by all of the buyers on our platform.

Gardner: Before we close out, let’s look to the future. Jon, when we think about getting this rich data, putting it in the hands of the people who can use it, we also are putting it in the hands of the machines that can use it, right?

So when we think about bots and artificial intelligence (AI) trends, what are some of your predictions for how the future will come about when it comes to procurement and inclusive supply chains?

The future is now

Stevens: You talked about trends. One is certainly around transparency and visibility; another one is around predictive analytics and intelligence. We believe that a third is around partnerships like this to drive more collaboration.

But predictive analytics, that’s not a future thing, that's something we do today and some of the leading procurement companies are figuring out how to take advantage of it. So, for example, when a machine breaks down, you are not waiting for it. Instead, the machine is telling our systems, “Hey, wait a minute, I've got a problem.”

Not only that, but they are producing for the buyer the intelligence that they need to order something. We already know who the suppliers are, we already know what potentially should be done, and we are providing these decisions to procurement organizations.
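The flow described here -- a machine fault signal turning into a ready-to-approve purchase recommendation with suppliers already attached -- might look something like this sketch. The part numbers, catalog, and fault-event fields are illustrative assumptions, not an actual SAP Ariba API:

```python
# Hypothetical sourcing data the network already knows about each part.
PART_CATALOG = {
    "bearing-7x": {"suppliers": ["M and R Distribution", "Grainger"], "lead_time_days": 3},
    "belt-q2": {"suppliers": ["Fastenal"], "lead_time_days": 5},
}

def recommend_order(fault_event):
    """Map a machine-reported fault to a purchase recommendation."""
    part = fault_event["failing_part"]
    info = PART_CATALOG.get(part)
    if info is None:
        # No sourcing data: hand off to a human instead of guessing.
        return {"action": "escalate", "reason": f"no sourcing data for {part}"}
    return {
        "action": "order",
        "part": part,
        "suppliers": info["suppliers"],       # candidates already known to the network
        "lead_time_days": info["lead_time_days"],
    }

event = {"machine_id": "press-12", "failing_part": "bearing-7x", "severity": "warning"}
rec = recommend_order(event)
print(rec["action"], rec["suppliers"])
```

The point of the pattern is that the buyer approves a pre-assembled decision rather than starting a search from scratch after the machine has already failed.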

The future, it’s here, you see it in our personal lives, on our phones, when you get recommendations in the morning, on the news, and everything else. It’s here today through some of our solutions.

And this trend around diversity, it’s also here. You mentioned SAP Ariba Spot Buy, and we also have other solutions like SAP Ariba Discovery, where a procurement person is starting to create a sourcing event. We have the ability in our solutions to automatically recommend suppliers based on the goals that the procurement organization has; we can pre-populate and recommend the diverse MRO suppliers that you might want to consider for your program.

You’re seeing that today through the Ariba Network and through things like Guided Buying, where we are helping facilitate many of those steps for procurement organizations. So it's really fun and the future in many respects is here right now.

Value-driven supply chains

Robinson: I envision a future in procurement of being able to make informed decisions on supplier selection. Procurement professionals are in a great position to change the world, and the CPOs of the future are going to be Millennials. They want more control, they want more transparency, and, to Quentin’s point, they want to buy from organizations that share their values.

Our partnership with SAP Ariba will create this environment where we can move closer to fulfilling this vision: whenever you have put a specification into the system, you’ll be pushed supplier options, and you can actually configure your criteria such that you create this optimal supplier mix -- whether diversity is important to you, green/environmental issues are important to you, or ethical practices are important to you. All of this can be built in and weighted within your selection. You will create an optimal supplier portfolio that balances all of the things that are important to you and your organization.

McCorvey: Why am I excited? This conversation has come full circle for me. I started off talking about supply chain optimization and some of the challenges it poses for businesses like mine. We know that people most often do business with people they know, like, and appreciate. What I want to do is turn a digital connection into a digital handshake, and use predictive analytics and the connection between Jon and Rod to create an opportunity for folks to know me, for me to grow as a new organization, and for me to be at the forefront of their minds. That is a challenge that this kind of supply chain optimization helps to overcome.

I’m really happy for where this is going to go in the future. In the end, there are going to be a lot of organizations both large and small that are going to benefit from this partnership. I look forward to the great things that are going to come from it, for not only both organizations -- but for people like me across the country.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

How AI, IoT and blockchain will shake up procurement and supply chains

The next BriefingsDirect digital business thought leadership panel discussion focuses on how artificial intelligence (AI), the Internet of things (IoT), machine learning (ML), and blockchain will shake up procurement and supply chain optimization.

Stay with us now as we develop a new vision for how today's cutting-edge technologies will usher in tomorrow's most powerful business tools and processes. The panel was assembled and recorded at the recent 2017 SAP Ariba LIVE conference in Las Vegas. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about the data-driven, predictive analytics, and augmented intelligence approach to supply chain management and procurement, please welcome the executives from SAP Ariba:

Here are some excerpts:

Gardner: It seems like only yesterday we were content to have a single view of a customer, or clean data, or maybe end-to-end value from a single business process. But now, we are poised to leapfrog the status quo by using words like predictive and proactive for many business functions.

Why are AI and ML such disrupters to how we've been doing business processes?

Shahane: If you look back, some of the technological impact in our private lives is now reaching our business lives. Think about the amount of data and signals that we are gathering; we call it big data.

We not only do transactions in our personal life, we also have a lot of content that gets pushed at us. Our phone records, our location as we move, so we are wired and we are hyper-connected.

Shahane

Similar things are happening to businesses. Since we are so connected, a lot of data is created. Having all that big data -- and it could be a problem from the privacy perspective -- gives you an opportunity to harness that data, to optimize it, and to make your processes much more efficient and much more engaging.

If you think about dealing with big data, you try and find patterns in that data, instead of looking at just the raw data. Finding those patterns collectively as a discipline is called machine learning. There are various techniques, and you can find a regression pattern, or you can find a recommendation pattern -- you can find all kinds of patterns that will optimize things, and make your experience a lot more engaging.
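Two of the pattern types mentioned -- a regression pattern and a recommendation pattern -- can be illustrated in miniature. The data and functions below are invented purely for the example:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (the regression pattern)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def recommend(baskets, item):
    """Recommend the item most often bought alongside `item` (the recommendation pattern)."""
    counts = {}
    for basket in baskets:
        if item in basket:
            for other in basket:
                if other != item:
                    counts[other] = counts.get(other, 0) + 1
    return max(counts, key=counts.get) if counts else None

# Fit spend-vs-volume points that lie roughly on y = 2x.
a, b = fit_line([1, 2, 3, 4], [2.1, 4.0, 6.2, 7.9])
# Past purchase baskets: docks co-occur with laptops most often.
baskets = [{"laptop", "dock"}, {"laptop", "dock", "mouse"}, {"laptop", "dock"}]
print(round(a, 1), recommend(baskets, "laptop"))
```

Both functions are doing the same underlying thing at toy scale: compressing raw records into a reusable pattern, which is the step the raw data alone does not give you.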

If you combine all these machine learning techniques with tools such as natural language processing (NLP), higher-level tools such as inference engines, and text-to-speech processing -- you get things like Siri and Alexa. These were created for the consumer space, but the same capabilities can be available to your businesses, and you can train them for your business processes. Overall, they improve efficiency, delight users, and provide a very engaging user experience.

Gardner: Sanjay, from the network perspective it seems like we are able to take advantage of really advanced cloud services, put that into a user experience that could be conversational, like we do with our personal consumer devices.

What is it about the cloud services in the network, however, that are game-changers when it comes to applying AI and ML to just good old business processes?

Multiple intelligence recommended

Almeida: Building on Dinesh’s comment, we have a lot of intelligent devices in our homes. When we watch Netflix, there are a lot of recommendations that happen. We control devices through voice. When we get home the lights are on. There is a lot of intelligence built into our personal lives. And when we go to work, especially in an enterprise, the experience is far different. How do we make sure that your experience at home carries forward to when you are at work?

From the enterprise and business networks perspective, we have a lot of data; a lot of business data about the purchases, the behaviors, the commodities. We can use that data to make the business processes a lot more efficient, using some of the models that Dinesh talked about.

Almeida

How do we actually do a recommendation so that we move away from traditional search, and take action on rows and columns, and drive that through a voice interface? How do we bring that intelligence together, and recommend the next actions or the next business process? How do we use the data that we have and make it a more recommended-based interaction versus the traditional forms-based interaction?

Gardner: Sudhir, when we go out to the marketplace with these technologies, and people begin to use them for making better decisions, what will that bring to procurement and supply chain activities? Are we really talking about letting the machines make the decisions? Where does the best of what machines do and the best of what people do meet?

Bhojwani: Quite often I get this question: What will be the role of procurement in 2025? Are the machines going to be able to make all the decisions, leaving us no role to play? You can say the same thing about all aspects of life, so why only procurement?

I think human intelligence is still here to stay. I believe, personally, it can be augmented. Let's take a concrete example to see what it means. At SAP Ariba, we are working on a product called product sourcing. Essentially this product takes a bill of material (BOM), and it tells you the impact. So what is so cool about it?

One of our customers has a BOM, which is an eight-level deep tree with 10 million nodes in it. In this 10 million-node commodity tree, or BOM, a person is responsible for managing all the items. But how does he or she know what is the impact of a delay on the entire tree? How do you visualize that?

Bhojwani

I think humans are very poor at visualizing a 10-million node tree; machines are really good at it. Well, where the human is still going to be required is that eventually you have to make a decision. Are we comfortable that the machine alone makes a decision? Only time will tell. I continue to think that this kind of augmented intelligence is what we are looking for, not some machine making complete decisions on our behalf.
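The kind of traversal a machine handles instantly can be sketched in a few lines. This is a toy, six-node BOM rather than a 10-million-node one, with invented part names; the structure (child-to-parent links) is the essence.

```python
# Hypothetical six-node BOM fragment: child -> parent (None marks the top assembly)
bom_parent = {
    "laptop": None,
    "mainboard": "laptop",
    "chassis": "laptop",
    "cpu": "mainboard",
    "memory": "mainboard",
    "hinge": "chassis",
}

def impacted_assemblies(delayed_item):
    """Walk the parent links: every assembly above a delayed item may slip too."""
    impacted = []
    node = bom_parent[delayed_item]
    while node is not None:
        impacted.append(node)
        node = bom_parent[node]
    return impacted

# A delayed CPU ripples up through the mainboard to the finished laptop
print(impacted_assemblies("cpu"))
```

At 10 million nodes the same walk is trivial for software and impossible for a person, which is exactly the augmentation Bhojwani describes: the machine computes the impact, the human makes the call.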

Gardner: Dinesh, in order to make this more than what we get in our personal consumer space -- which in some cases is nice to have but doesn't really change the game -- we are looking for higher productivity in business. The C-Suite is looking for increased margins; they are looking for big efficiencies. What is it from a business point of view that these technologies can bring? Is this going to be just lipstick on a pig, so to speak, or do we really get to change how business productivity comes about?

Humans and machines working together

Shahane: I truly believe it will change productivity. The whole intelligence advantage -- if you look at it from the highest perspective, like enhanced user experience -- provides an ability to help you make your decisions.

When you make decisions having this augmented assistant helping you along the way -- and at the same time dealing with large amount of data combined in a business benefit -- I think it will make a huge impact.

Let me give you an example. Think about supplier risk. Today, at first you look at risk as the people on the network, and how you are directly doing business with them. You want to know everything about them, their profile, and you care about them being a good business partner to you.

But think about the second, third, and fourth tiers, where some things are not so visible to your business. All that information for those further tiers is not directly available on the network; it is distant. But if those signals can be captured and somehow surfaced in your decision-making, it can really reduce risk.

Reducing risk means more productivity, more benefits to your businesses. So that is one advantage I could see, but there will be a number of advantages. I think we'll run out of time if we start talking about all of those.

Gardner: Sanjay, help us better understand. When we take these technologies and apply them to procurement, what does that mean for the procurement people themselves?

Almeida: There are two inputs that you need to make strategic decisions, and one is the data. You look at that data and you try to make sense out of it. As Sudhir mentioned, there is a limit to human beings in terms of how much data processing that they can do -- and that's where some of these technologies will help quite a bit to make better decisions.

The other part is personal biases, and eliminating personal biases by using the data. It will improve the accuracy of your strategic decisions. A combination of those two will help make better decisions, faster decisions, and procurement groups can focus on the right stuff, versus being busy with the day-to-day tasks.

Using these technologies, the data, and the power of the data from computational excellence -- that's taking the personal biases out of making decisions. That combination will really help them make better strategic decisions.

Bhojwani: Let me add something to what Sanjay said. One of the biggest things we're seeing now in procurement, and in enterprise software in general, is that people's expectations have clearly gone up based on their personal experiences outside. I mean, 10 years back I could not have imagined that I would never go to a store to buy shoes. I thought, who buys shoes online? Now, I never go to stores. I don't know when the last time was that I bought shoes anywhere but online; it's been a few years, in fact. Now, think about that expectation on procurement software.

Currently procurement has been looked upon as a gatekeeper; they ensure that nobody does anything wrong. The problem with that approach is it is a “stick” model, there is no “carrot” behind it. What users want is, “Hey, show me the benefit and I will follow the rules.” We can't punish the entire company because of a couple of bad apples.

By and large, most people want to follow the rules. They just don't know what the rules are; they don't have a platform that makes that decision-making easy, that enables them to get the job done sooner, faster, better. And that happens when the user experience is acceptable and where procurement is no longer looked down upon as a gatekeeper. That is the fundamental shift that has to happen, procurement has to start thinking about themselves as an enabler, not a gatekeeper. That's the fundamental shift.

Gardner: Here at SAP Ariba LIVE 2017, we're hearing about new products and services. Are there any of the new products and services that we could point to and say, aha, this is a harbinger of things to come?

In blockchain we trust

Shahane: Conversational interfaces and bots are a fairly easy technology for anyone to adopt nowadays, especially because some of these algorithms are available so easily. But -- from my perspective -- I think the technologies that will have a huge impact on our lives will be the advent of IoT devices, 3D printing, and blockchain.

To me, blockchain is the most exciting one. It will have a huge impact on the way people look at the business network. Some people think about blockchain as a complementary idea to the network. Other people think that it is contradictory to the network. We believe it is complementary.

Blockchain reaches out to the boundary of your network, to faraway places that we are not even connected to, and brings that into a governance model where all of your processes and all your transactions are captured in the central network.

I believe that a trusted transactional model, combined with other innovations like IoT where a machine could order by itself -- my favorite example is a washing machine that starts working when energy is cheaper -- is a pretty exciting use case.

This is a combination of open platforms and IoT combining with blockchain-based energy-rate brokering. These are the kind of use cases that will become possible in the future. I see a platform sitting in the center of all these innovations.
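The trust property that makes such machine-to-machine transactions auditable comes from hash-chaining each record to the one before it. Below is a minimal, illustrative sketch of that mechanism only -- not SAP Ariba's implementation; the device names and fields are invented.

```python
import hashlib
import json

def add_block(chain, transaction):
    """Append a transaction, linking it to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"tx": transaction, "prev": prev_hash}, sort_keys=True)
    chain.append({"tx": transaction, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify(chain):
    """Recompute every link; tampering anywhere upstream breaks the chain."""
    prev_hash = "0" * 64
    for block in chain:
        payload = json.dumps({"tx": block["tx"], "prev": prev_hash}, sort_keys=True)
        if block["prev"] != prev_hash or \
           block["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev_hash = block["hash"]
    return True

# The washing machine buys cheap energy and records each purchase on the ledger
ledger = []
add_block(ledger, {"device": "washer-01", "kwh": 2.0, "rate": 0.07})
add_block(ledger, {"device": "washer-01", "kwh": 1.5, "rate": 0.06})
```

Because every block's hash covers the previous block's hash, no party can quietly rewrite an earlier transaction without invalidating everything after it -- which is the "trusted commerce" property in miniature.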

Gardner: Sanjay, let’s look at blockchain from your perspective. How do you see that ability of a distributed network authority fitting into business processes? Maybe people hadn't quite put those two together.

Almeida: The core concept of blockchain is distributed trust and transparency. When we look at business networks, we obviously have the largest network in the world. We have more than 2.5 million buyers and suppliers transacting on the SAP Ariba Network -- but there are hundreds of millions of others who are not on the network. Obviously we would like to get them.

If you use the blockchain technology to bring that trust together, it's a federated trust model. Then our supply chain would be a lot more efficient, a lot more trustworthy. It will improve efficiency, and all the risk associated with managing suppliers will be managed better by using that technology.

Gardner: So this isn’t a “maybe,” or an “if.” It’s “definitely,” blockchain will be a significant technology for advancing productivity in business processes and business platforms?

Almeida: Absolutely. And you have to have the scale of an SAP Ariba -- the scale from the number of suppliers and the amount of business that happens on the network. You have to have scale and technology together to make that happen. We want to be at the center of a blockchain, we want to be a blockchain provider, so that other third-party ecosystem partners can be part of this trusted network and make this process a lot more efficient.

Gardner: Sudhir, for those who are listening and reading and are interested in taking advantage of ML and better data, of what the IoT will bring, and of AI where it makes sense -- what should they be doing now to prepare their organization to best take advantage of the technologies and services that folks like SAP Ariba provide, so that they can stand out in their industry?

Bhojwani: That’s a very good question, and that's one of our central themes. At the core of it, I fundamentally believe the tool cannot solve the problem completely on its own, you have to change as well. If the companies continue to want to stick to the old processes -- but try to apply the new technology -- it doesn’t solve the problem. We have seen that movie played before. People get our tool, they say, hey, we were sold very good visions, so we bought the SAP Ariba tool. We tried to implement it and it didn’t work for us.

When you question that, generally the answer is: we just tried to use the tool -- tried to change the tool to fit our model, to fit our process. We didn't try to change the processes. As for blockchain, enterprises are not used to opening up for track and trace; they are not exposing that kind of information in any shape or form -- or they are very secretive about it.

So for them to suddenly participate in this requires a change on their side. It requires seeing what is the benefit for me, what is the value that it offers me? Slowly but surely that value is starting to become very, very clear. You hear more companies -- especially on the payment side -- starting to participate in blockchain. A general ledger will be available on blockchain some day. This is one of the big ideas for SAP.

If you think about SAP, they run more general ledgers in the world than any other company. They are probably the biggest general ledger company that connects all of that. Those things are possible, but it’s still a technology only until the companies want to say, “Hey, this is the value … but I have to change myself as well.”

This changing-yourself part, even though it sounds so simple, is what we are seeing in the consumer world, where change happens a little bit faster than in the enterprise world. But even that is changing, because of the demands and expectations that end-users, the Millennials, bring when they come into the workforce. Enterprises, if they continue to resist, won't be sustainable.

They will be forced to change. So I personally believe that in the next three to five years, as more and more Millennials enter the workforce, you will see people adopting blockchain and new ledgers at a much faster pace.

A change on both sides

Shahane: I think Sudhir put it very nicely. I think enterprises need to be open to change. You can achieve transformation if the value is clearly articulated. One of the big changes for procurement is you need to transition yourself from being a spend controller into a value creator. There is a lot of technology that will benefit you, and some of the technology vendors like us, we cannot just throw a major change at our users. We have to do it gradually. For example, with AI it will start as augmented first, before it starts making algorithmic decisions.

So it is a change on both sides, and once that happens -- and once we trust each other on the system -- nice things will happen.

Almeida: One thing I would add to that is organizations need to think about what they want to achieve in the future and adopt the tool and technology and business processes for their future business goals. It’s not about living in the past because the past is going to be gone. So how do you differentiate yourself, your business with the rest of the competition that you have?

The past business processes, people, and technology may not necessarily get you there. So how do you leverage the technology that companies like SAP and Ariba provide? Think about what your future business processes should be. The people that you will have, as Sudhir mentioned, the Millennials, have different expectations and won't accept the status quo.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

You may also be interested in:

Converged IoT systems: Bringing the data center to the edge of everything

The demands of data processing, real-time analytics, and platform efficiency at the intercept of IoT and business benefits have forced new technology approaches. We'll now learn how converged systems and high-performance data analysis platforms are bringing the data center to the operational technology (OT) edge.

Sumo Logic CEO on how modern apps benefit from 'continuous intelligence' and DevOps insights

The next BriefingsDirect applications health monitoring interview explores how a new breed of continuous intelligence emerges by gaining data from systems infrastructure logs -- either on-premises or in the cloud -- and then cross-referencing that with intrinsic business metrics information.

We’ll now explore how these new levels of insight and intelligence into what really goes on underneath the covers of modern applications help ensure that apps are built, deployed, and operated properly.

Today, more than ever, how a company's applications perform equates with how the company itself performs and is perceived. From airlines to retail, from finding cabs to gaming, how the applications work deeply impacts how the business processes and business outcomes work, too.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

We’re joined by an executive from Sumo Logic to learn why modern applications are different, what's needed to make them robust and agile, and how the right mix of data, metrics and machine learning provides the means to make and keep apps operating better than ever.

To describe how to build and maintain the best applications, welcome Ramin Sayar, President and CEO of Sumo Logic. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: There’s no doubt that the apps make the company, but what is it about modern applications that makes them so difficult to really know? How is that different from the applications we were using 10 years ago?

Sayar: You hit it on the head a little bit earlier. This notion of always-on, always-available, always-accessible applications -- delivered through rich web and mobile interfaces, or through traditional mechanisms served up through laptops, other access points, and point-of-sale systems -- is driving the next wave of technology architecture supporting these apps.

These modern apps are around a modern stack, and so they’re using new platform services that are created by public-cloud providers, they’re using new development processes such as agile or continuous delivery, and they’re expected to constantly be learning and iterating so they can improve not only the user experience -- but the business outcomes.

Gardner: Of course, developers and business leaders are under pressure, more than ever before, to put new apps out more quickly, and to then update and refine them on a continuous basis. So this is a never-ending process.

User experience

Sayar: You're spot on. The obvious benefit of always-on is centered on rich user interaction and user experience. So, while a lot of the conversation around modern apps tends to focus on the technology and the components, there are actually fundamental challenges in the process of how these new apps are built and managed on an ongoing basis, and in what implications that has for security. A lot of times, those two aspects are left out when people are discussing modern apps.

 

Gardner: That's right. We’re now talking so much about DevOps these days, but in the same breath, we’re taking about SecOps -- security and operations. They’re really joined at the hip.

Sayar: Yes, they're starting to blend. You're seeing the technology decisions around public cloud, Docker and containers, and microservices and APIs no longer being led only by developers or DevOps teams. They're heavily influenced by, and made in partnership with, the SecOps and security teams and CISOs, because the data is distributed. Now there needs to be better visibility and instrumentation, not just for the access logs, but for the business process and a holistic view of the service and service-level agreements (SLAs).

Gardner: What’s different from say 10 years ago? Distributed used to mean that I had, under my own data-center roof, an application that would be drawing from a database, using an application server, perhaps a couple of services, but mostly all under my control. Now, it’s much more complex, with many more moving parts.

Sayar: We like to look at the evolution of these modern apps. For example, a lot of our customers have traditional monolithic apps that follow the more traditional waterfall approach for iterating and release. Often, those are run on bare-metal physical servers, or possibly virtual machines (VMs). They are simple, three-tier web apps.


We see one of two things happening. The first is a need for replacing the front end of those apps; we refer to those as brownfield. They start to change from waterfall to agile and take on more of an N-tier feel. It's really more around the front end -- maybe your web properties are a good example of that. And they start to componentize pieces of their apps, either on VMs or in private clouds, and that's often good for existing types of workloads.

The other big trend is this new way of building apps, what we call greenfield workloads, versus the brownfield workloads, and those take a fundamentally different approach.

Often it's centered on new technology, a stack entirely using microservices, API-first development methodology, and using new modern containers like Docker, Mesosphere, CoreOS, and using public-cloud infrastructure and services from Amazon Web Services (AWS), or Microsoft Azure. As a result, what you’re seeing is the technology decisions that are made there require different skill sets and teams to come together to be able to deliver on the DevOps and SecOps processes that we just mentioned.

Gardner: Ramin, it’s important to point out that we’re not just talking about public-facing business-to-consumer (B2C) apps, not that those aren't important, but we’re also talking about all those very important business-to-business (B2B) and business-to-employee (B2E) apps. I can't tell you how frustrating it is when you get on the phone with somebody and they say, “Well, I’ll help you, but my app is down,” or the data isn’t available. So this is not just for the public facing apps, it's all apps, right?

It's a data problem

Sayar: Absolutely. Regardless of whether it's enterprise or consumer, if it's mid-market small and medium business (SMB) or enterprise that you are building these apps for, what we see from our customers is that they all have a similar challenge, and they’re really trying to deal with the volume, the velocity, and the variety of the data around these new architectures and how they grapple and get their hands around it. At the end of day, it becomes a data problem, not just a process or technology problem.

Gardner: Let's talk about the challenges then. If we have many moving parts, if we need to do things faster, if we need to consider the development lifecycle and processes as well as ongoing security, if we’re dealing with outside third-party cloud providers, where do we go to find the common thread of insight, even though we have more complexity across more organizational boundaries?

Sayar: From a Sumo Logic perspective, we’re trying to provide full-stack visibility, not only from code and your repositories like GitHub or Jenkins, but all the way through the components of your code, to API calls, to what your deployment tools are used for in terms of provisioning and performance.

We spend a lot of effort to integrate to the various DevOps tool chain vendors, as well as provide the holistic view of what users are doing in terms of access to those applications and services. We know who has checked in which code or which branch and which build created potential issues for the performance, latency, or outage. So we give you that 360-view by providing that full stack set of capabilities.

Gardner: So, the more information the better, no matter where in the process, no matter where in the lifecycle. But then, that adds its own level of complexity. I wonder is this a fire-hose approach or boiling-the-ocean approach? How do you make that manageable and then actionable?

Sayar: We've invested quite a bit of our intellectual property (IP) not only in providing integration with these various sources of data, but also in the machine learning and algorithms, so that we can take advantage of the architecture of being a true cloud-native, multitenant, fast, and simple solution.

So, unlike others that are out there and available for you, Sumo Logic's architecture is truly cloud native and multitenant, but it's centered on the principle of near real-time data streaming.

As the data is coming in, our data-streaming engine allows developers, IT ops administrators, sys admins, and security professionals to each have their own view, coarse-grained or fine-grained, through the role-based access controls we have in the system, and to leverage the same data for different purposes -- versus having to wait for someone to create a dashboard, create a view, or be able to get access to a system when something breaks.

Gardner: That’s interesting. Having been in the industry long enough, I remember when logs basically meant batch. You'd get a log dump, and then you would do something with it. That would generate a report, many times with manual steps involved. So what's the big step to going to streaming? Why is that an essential part of making this so actionable?

Sayar: It’s driven based on the architectures and the applications. No longer is it acceptable to look at samples of data that span 5 or 15 minutes. You need the real-time data, sub-second, millisecond latency to be able to understand causality, and be able to understand when you’re having a potential threat, risk, or security concern, versus code-quality issues that are causing potential performance outages and therefore business impact.

The old way, hoping and praying that when I deployed code I would find problems only when a user complained, is no longer acceptable. You lose business and credibility, and at the end of the day, there's no real way to hold developers, operations folks, or security folks accountable because of the legacy tools and process approach.

Center of the business

Those expectations have changed, because of the consumerization of IT and the fact that apps are the center of the business, as we’ve talked about. What we really do is provide a simple way for us to analyze the metadata coming in and provide very simple access through APIs or through our user interfaces based on your role to be able to address issues proactively.

Conceptually, there’s this notion of wartime and peacetime as we’re building and delivering our service. We look at the problems that users -- customers of Sumo Logic and internally here at Sumo Logic -- are used to and then we break that down into this lifecycle -- centered on this concept of peacetime and wartime.

Peacetime is when nothing is wrong, but you want to stay ahead of issues and you want to be able to proactively assess the health of your service, your application, your operational level agreements, your SLAs, and be notified when something is trending the wrong way.

Then, there's this notion of wartime, and wartime is all hands on deck. Instead of being alerted 15 minutes or an hour after an outage has happened or a security risk or threat has been discovered, the real-time data-streaming engine notifies people instantly, and you're getting PagerDuty alerts, you're getting Slack notifications. It's no longer the traditional helpdesk notification process with people getting on bridge lines.
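The peacetime-to-wartime transition can be illustrated with a toy sliding-window alerter. This is a rough sketch of the concept only, not Sumo Logic's engine; the window size, threshold, and error counts are invented.

```python
from collections import deque

WINDOW = 5       # number of recent intervals in the sliding window
THRESHOLD = 10   # average errors-per-interval that flips peacetime to wartime

def watch(error_counts):
    """Yield an alert the moment the windowed error rate crosses the line."""
    window = deque(maxlen=WINDOW)
    for t, count in enumerate(error_counts):
        window.append(count)
        if sum(window) / len(window) > THRESHOLD:
            yield t, "wartime: page the on-call"

# Invented per-interval error counts; the spike begins at interval 6
stream = [1, 2, 1, 3, 2, 9, 25, 40, 2, 1]
alerts = list(watch(stream))
```

Because the check runs as each reading arrives, the alert fires within one interval of the spike, rather than after a batch report is generated.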

Because the teams are often distributed and it’s shared responsibility and ownership for identifying an issue in wartime, we're enabling collaboration and new ways of collaboration by leveraging the integrations to things like Slack, PagerDuty notification systems through the real-time platform we've built.

So, the always-on application expectations that customers and consumers have, have now been transformed to always-on available development and security resources to be able to address problems proactively.

Gardner: It sounds like we're able to not only take the data and information in real time from the applications to understand what’s going on with the applications, but we can take that same information and start applying it to other business metrics, other business environmental impacts that then give us an even greater insight into how to manage the business and the processes. Am I overstating that or is that where we are heading here?

Sayar: That's exactly right. The essence of what we provide is a platform that leverages the machine logs and time-series data from a single platform or service, eliminating a lot of the complexity that exists in traditional processes and tools. No longer do you need to do "swivel-chair" correlation, looking across multiple UIs and tools and products. No longer do you have to wait for the helpdesk person to notify you. We're trying to provide that instant knowledge and collaboration through the real-time data-streaming platform we've built, to bring teams together versus divided.

Gardner: That sounds terrific if I'm the IT guy or gal, but why should this be of interest to somebody higher up in the organization, at a business process, even at a C-table level? What is it about continuous intelligence that cannot only help apps run on time and well, but help my business run on time and well?

Need for agility

Sayar: We talked a little bit about the whole need for agility. From a business point of view, the line-of-business folks who are associated with any of these greenfield projects or apps want to be able to increase the cycle times of the application delivery. They want to have measurable results in terms of application changes or web changes, so that their web properties have either increased or potentially decreased in terms of user satisfaction or, at the end of the day, business revenue.

So, we're able to help the developers, the DevOps teams, and ultimately, line of business deliver on the speed and agility needs for these new modes. We do that through a single comprehensive platform, as I mentioned.

At the same time, what’s interesting here is that no longer is security an afterthought. No longer is security in the back room trying to figure out when a threat or an attack has happened. Security has a seat at the table in a lot of boardrooms, and more importantly, in a lot of strategic initiatives for enterprise companies today.

At the same time we're helping with agility, we're also helping with prevention. And so a lot of our customers often start with the security teams that are looking for a new way to be able to inspect this volume of data that’s coming in -- not at the infrastructure level or only the end-user level -- but at the application and code level. What we're really able to do, as I mentioned earlier, is provide a unifying approach to bring these disparate teams together.


Gardner: And yet individuals can extract the intelligence view that best suits what their needs are in that moment.

Sayar: Yes. And ultimately what we're able to do is improve customer experience, increase revenue-generating services, increase efficiencies and agility in delivering quality code -- and therefore quality applications -- and lastly, improve collaboration and communication.

Gardner: I'd really like to hear some real-world examples of how this works, but before we go there, I'm still interested in the how. As to this idea of machine learning, we're hearing an awful lot today about bots, artificial intelligence (AI), and machine learning. Parse this out a bit for me. What is it that you're using machine learning for when it comes to this volume and variety in understanding apps, and making that usable in the context of a business metric of some kind?

Sayar: This is an interesting topic, because of a lot of noise in the market around big data, machine learning, and advanced analytics. Since Sumo Logic was started six years ago, we built this platform to ensure that not only do we have best-in-class security and encryption capabilities, but it was centered on the fundamental purpose of democratizing analytics -- making it simpler to allow more than just a subset of folks to get access to information for their roles and responsibilities, whether you're on security, ops, or development teams.

To answer your question a little more succinctly, our platform is predicated on multiple levels of machine-learning and analytics capabilities. Starting at the lowest level, something that we refer to as LogReduce is meant to separate signal from noise. Ultimately, it helps a lot of our users and customers reduce mean time to identification by upwards of 90 percent, because they're not searching the irrelevant data. They're searching the relevant data -- the infrequent, not-well-known events -- rather than what's constantly occurring in their environment.

In doing so, it’s not just about mean time to identification, but it’s also how quickly we're able to respond and repair. We've seen customers using LogReduce reduce the mean time to resolution by upwards of 50 percent.
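The LogReduce idea -- collapsing a flood of raw log lines into a handful of recurring patterns so the rare ones stand out -- can be sketched in a few lines. This is not Sumo Logic's actual algorithm, just a minimal illustration of the signature-clustering approach: mask the variable tokens, group by the resulting template, and surface the infrequent patterns.

```python
import re
from collections import Counter

def template(line: str) -> str:
    """Mask variable tokens (hex IDs, then any digits) so similar lines collapse."""
    line = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", line)
    line = re.sub(r"\d+", "<NUM>", line)
    return line

def reduce_logs(lines, rare_threshold=2):
    """Group lines by template; return the rare patterns worth a human's attention."""
    counts = Counter(template(l) for l in lines)
    return [(tpl, n) for tpl, n in counts.items() if n <= rare_threshold]

logs = [
    "GET /api/user/42 200 12ms",
    "GET /api/user/7 200 9ms",
    "GET /api/user/99 200 15ms",
    "ERROR disk full on volume 0x1f3a",
]
print(reduce_logs(logs))  # -> [('ERROR disk full on volume <HEX>', 1)]
```

The three GET lines collapse into one signature; only the one-off error survives the filter, which is the "relevant but infrequent" data described above.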

Predictive capabilities

Our core analytics, at the lowest level, is helping solve operational metrics and value. Then, we start to become less reactive. When you've had an outage or a security threat, you start to leverage some of our other predictive capabilities in our stack.

For example, I mentioned this concept of peacetime and wartime. In peacetime, you're looking at changes over time as you deploy code and applications to various geographies and locations. A lot of the developers and ops folks that use Sumo want to use the log compare or outlier predictor operators in our machine-learning capabilities to show and compare differences between branches of code, and to relate code quality to the performance and availability of the service and app.

We allow them, with a click of a button, to compare this window for these events and these metrics for the last hour, last day, last week, last month, and compare them to other time slices of data and show how much better or worse it is. This is before deploying to production. When they look at production, we're able to allow them to use predictive analytics to look at anomalies and abnormal behavior to get more proactive.
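The compare-two-time-slices workflow can be illustrated with a toy diff of event counts between windows. The function below is a hypothetical sketch of the idea, not Sumo Logic's operator:

```python
from collections import Counter

def compare_windows(baseline, current):
    """Report event signatures whose frequency changed between two time slices."""
    b, c = Counter(baseline), Counter(current)
    return {sig: (b[sig], c[sig])
            for sig in sorted(set(b) | set(c))
            if b[sig] != c[sig]}

last_week = ["login_ok"] * 50 + ["login_fail"] * 2
today = ["login_ok"] * 48 + ["login_fail"] * 20
print(compare_windows(last_week, today))
# -> {'login_fail': (2, 20), 'login_ok': (50, 48)}
```

A tenfold jump in login failures between the two slices is exactly the kind of difference a pre-deployment comparison is meant to surface.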

So, reactive, to proactive, all the way to predictive is the philosophy that we've been trying to build in terms of our analytics stack and capabilities.
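The predictive end of that progression -- flagging anomalies before a human notices -- often boils down to baselining a metric and flagging deviations. Here is a crude stand-in for that kind of outlier detection (a fixed trailing window and a k-sigma rule are assumptions for illustration, not Sumo Logic's actual models):

```python
from statistics import mean, stdev

def outliers(series, window=10, k=3.0):
    """Flag points deviating more than k standard deviations
    from the trailing-window baseline."""
    flagged = []
    for i in range(window, len(series)):
        base = series[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma and abs(series[i] - mu) > k * sigma:
            flagged.append(i)
    return flagged

latencies = [100, 102, 98, 101, 99, 103, 97, 100, 102, 99, 450, 101]
print(outliers(latencies))  # -> [10], the 450 ms spike
```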

Gardner: How are some actual customers using this and what are they getting back for their investment?

Sayar: We have customers that span retail and e-commerce, high-tech, media, entertainment, travel, and insurance. We're well north of 1,200 unique paying customers, and they span anyone from Airbnb, Anheuser-Busch, Adobe, Metadata, Marriott, Twitter, Telstra, Xora -- modern companies as well as traditional companies.

What do they all have in common? Often, what we see is a digital transformation project or initiative. They either have to build greenfield or brownfield apps and they need a new approach and a new service, and that's where they start leveraging Sumo Logic.

Second, what we see is that it's not always a digital transformation; it's often a cost reduction and/or a consolidation project. Consolidation could be tools or infrastructure and data center, or it could be migration to co-los or public-cloud infrastructures.

The nice thing about Sumo Logic is that we can connect anything from your top of rack switch, to your discrete storage arrays, to network devices, to operating system, and middleware, through to your content-delivery network (CDN) providers and your public-cloud infrastructures.

As it’s a migration or consolidation project, we’re able to help them compare performance and availability, SLAs that they have associated with those, as well as differences in terms of delivery of infrastructure services to the developers or users.

So whether it's agility-driven or cost-driven, Sumo Logic is very relevant for all these customers that are spanning the data-center infrastructure consolidation to new workload projects that they may be building in private-cloud or public-cloud endpoints.

Gardner: Ramin, how about a couple of concrete examples of what you were just referring to.

Cloud migration

Sayar: One good example is in the media space or media and entertainment space, for example, Hearst Media. They, like a lot of our other customers, were undergoing a digital-transformation project and a cloud-migration project. They were moving about 36 apps to AWS and they needed a single platform that provided machine-learning analytics to be able to recognize and quickly identify performance issues prior to making the migration and updates to any of the apps rolling over to AWS. They were able to really improve cycle times, as well as efficiency, with respect to identifying and resolving issues fast.

Another example would be JetBlue. We do a lot in the travel space. JetBlue is another AWS and cloud customer. They provide a lot of in-flight entertainment to their customers. They wanted to be able to look at the service quality of the revenue model for the in-flight entertainment system -- to ascertain what movies are being watched, what the quality of service is, whether it's being degraded, and whether customers are being charged more than once for any type of service outage. That’s how they're using Sumo Logic to better assess and manage customer experience. It's not too dissimilar from Alaska Airlines or others that are also providing in-flight notification and wireless types of services.

The last one is someone that we're all pretty familiar with and that’s Airbnb. We're seeing a fundamental disruption in the travel space and how we reserve hotels or apartments or homes, and Airbnb has led the charge, like Uber in the transportation space. In their case, they're taking a lot of credit-card and payment-processing information. They're using Sumo Logic for payment-card industry (PCI) audit and security, as well as operational visibility in terms of their websites and presence.

Gardner: It’s interesting. Not only are you giving them benefits along insight lines, but it sounds to me like you're giving them a green light to go ahead and experiment and then learn very quickly whether that experiment worked or not, so that they can refine it. That’s so important in our digital business and agility drive these days.

Sayar: Absolutely. And if I were to think of another interesting example, Anheuser-Busch is another one of our customers. In this case, the CISO wanted to have a new approach to security and not one that was centered on guarding the data and access to the data, but providing a single platform for all constituents within Anheuser-Busch, whether security teams, operations teams, developers, or support teams.

We did a pilot for them, and as they're modernizing a lot of their apps, as they start to look at the next generation of security analytics, the adoption of Sumo started to become instant inside AB InBev. Now, they're looking at not just their existing real estate of infrastructure and apps for all these teams, but they're going to connect it to future projects such as the Connected Path, so they can understand what the yield is from each pour in a particular keg in a location and figure out whether that’s optimized or when they can replace the keg.

So, you're going from a reactive approach for security and processes around deployment and operations to next-gen connected Internet of Things (IoT) and devices to understand business performance and yield. That's a great example of an innovative company doing something unique and different with Sumo Logic.

Gardner: So, what happens as these companies modernize and they start to avail themselves of more public-cloud infrastructure services, ultimately more-and-more of their apps are going to be of, by, and for somebody else’s public cloud? Where do you fit in that scenario?

Data source and location

Sayar: Whether you're running on-premises, in co-los, through CDN providers like Akamai, on AWS, Azure, or Heroku, or on SaaS platforms, we provide a single platform that can manage and ingest all that data for you. Interestingly enough, about half our customers’ workloads run on-premises and half of them run in the cloud.

We’re agnostic to where the data is or where their applications or workloads reside. The benefit we provide is the single ubiquitous platform for managing the data streams that are coming in from devices, from applications, from infrastructure, from mobile to you, in a simple, real-time way through a multitenant cloud service.

Gardner: This reminds me of what I heard 10 or 15 years ago about business intelligence (BI): drawing in data, analyzing it, making it close to being proactive in its ability to help the organization. How is continuous intelligence different, or even better, and something that would replace what we refer to as BI?

Sayar: The issue that we faced with the first generation of BI was that it was very rear-view-mirror-centric, meaning that it was looking at data and events in the past. Where we're at today, with this need for speed and the necessity to be always on, always available, the expectation is sub-millisecond latency to understand what's going on, from a security, operational, or user-experience point of view.

I'd say that we're on V2, the next generation of what was traditionally called BI, and we refer to that as continuous intelligence, because you're continuously adapting and learning. It's not based only on what humans know and the rules, correlations, alarms, and filters they presuppose and create. It's what machine intelligence needs to supplement that with to provide the best-in-class capability.

Gardner: We’re almost out of time, but I wanted to look to the future a little bit. Obviously, there's a lot of investing going on now around big data and analytics as it pertains to many different elements of many different businesses, depending on their verticals. Then, we're talking about some of the logic benefit and continuous intelligence as it applies to applications and their lifecycle.

Where do we start to see crossover between those? How do I leverage what I’m doing in big data generally in my organization and more specifically, what I can do with continuous intelligence from my systems, from my applications?

Business Insights

Sayar: We touched a little bit on that in terms of the types of data that we integrate and ingest. At the end of the day, when we talk about full-stack visibility, it covers everything from business insights to operational insights to security insights.

We have some customers that are in credit-card payment processing, and they actually use us to understand activations for credit cards, so they're extracting value from the data coming into Sumo Logic to understand and predict business impact and relevant revenue associated with these services that they're managing; in this case, a set of apps that run on a CDN.


At the same time, the fraud and risk team are using us for threat and prevention. The operations team is using us for understanding identification of issues proactively to be able to address any application or infrastructure issues, and that’s what we refer to as full stack.

Full stack isn’t just the technology; it's providing business visibility insights to line-of-business users looking at metrics around user experience and service quality, to operational-level impacts that help you become more proactive, or in some cases, reactive to wartime issues, as we've talked about. And lastly, it helps the security team take a different security posture -- reactive and proactive -- around threat detection and risk.

In a nutshell, where we see these things starting to converge is what we refer to as full stack visibility around our strategy for continuous intelligence, and that is technology to business to users.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Sumo Logic.

You may also be interested in:

The Biggest HPE 3PAR StoreServ Announcement in Years!


Today is the biggest HPE 3PAR StoreServ announcement since we launched the 7000 family over four years ago. I’ll be talking about a new 3PAR OS and a lot of new features. Today, I'll summarize these features – and you can watch my podcast below for more of the announcement news. Over the next several days, our subject-matter experts will dive deep into the highlights of the announcement.

OCSL sets its sights on the Nirvana of hybrid IT—attaining the right mix of hybrid cloud for its clients

The next BriefingsDirect digital transformation case study explores how UK IT consultancy OCSL has set its sights on the holy grail of hybrid IT -- helping its clients to find and attain the right mix of hybrid cloud.

We'll now explore how each enterprise -- and perhaps even units within each enterprise -- determines the path to a proper mix of public and private cloud. Closer to home, they're looking at the proper fit of converged infrastructure, hyper-converged infrastructure (HCI), and software-defined data center (SDDC) platforms.

Implementing such a services-attuned architecture may be the most viable means to dynamically apportion applications and data support among and between cloud and on-premises deployments.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

 

To describe how to rationalize the right mix of hybrid cloud and hybrid IT services along with infrastructure choices on-premises, we are joined by Mark Skelton, Head of Consultancy at OCSL in London. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: People increasingly want to have some IT on premises, and they want public cloud -- with some available continuum between them. But deciding the right mix is difficult and probably something that’s going to change over time. What drivers are you seeing now as organizations make this determination?


Skelton: It’s a blend of lot of things. We've been working with enterprises for a long time on their hybrid and cloud messaging. Our clients have been struggling just to understand what hybrid really means, but also how we make hybrid a reality, and how to get started, because it really is a minefield. You look at what Microsoft is doing, what AWS is doing, and what HPE is doing in their technologies. There's so much out there. How do they get started?

We've been struggling in the last 18 months to get customers on that journey and get started. But now, because technology is advancing, we're seeing customers starting to embrace it and starting to evolve and transform into those things. And, we've matured our models and frameworks as well to help customer adoption.

Gardner: Do you see the rationale for hybrid IT shaking down to an economic equation? Is it to try to take advantage of technologies that are available? Is it about compliance and security? You're probably tempted to say all of the above, but I'm looking for what's driving the top-of-mind decision-making now.

Start with the economics

Skelton: The initial decision-making process begins with the economics. I think everyone has bought into the marketing messages from the public cloud providers saying, "We can reduce your costs, we can reduce your overhead -- and not just from a culture perspective, but from a management perspective, a personnel perspective, and a technology-solutions perspective."

 

 

CIOs, and even financial officers, see economics as the tipping point for moving into a hybrid cloud, or even going all-in on public cloud. But it’s not always cheap to put everything into a public cloud. When we look at business cases with clients, we look at the long-term investment, and over time public cloud isn't always the cheaper option. That’s where hybrid started to come back to the front of people’s minds.

 

We can use public cloud for the right workloads -- where clients want to be flexible, burst, and be a bit more agile, or even gain global reach for large global businesses -- but then keep the crown jewels back inside secured data centers, where they're known and trusted and closer to some of the key, critical systems.

 

So, it starts with the finance side of things, but quickly evolves beyond that, and financial decisions aren't the only reasons why people are going to public or hybrid cloud.

Gardner: In a more perfect world, we'd be able to move things back and forth with ease and simplicity, where we could take the A/B testing-type of approach to a public and private cloud decision. We're not quite there yet, but do you see a day where that choice about public and private will be dynamic -- and perhaps among multiple clouds or multi-cloud hybrid environment?

Skelton: Absolutely. I think multi-cloud is the Nirvana for every organization, just because there isn't a one-size-fits-all for every type of workload. We've been talking about it for quite a long time. The technology hasn't really been there to underpin multi-cloud and truly make it easy to move from on-premises to public or vice versa. But I think now we're getting there with the technology.

Are we there yet? No, there are still a few big releases coming, things that we're waiting to be released to market, which will help simplify that multi-cloud and the ability to migrate up and back, but we're just not there yet, in my opinion.

Gardner: We might be tempted to break this out between applications and data. Application workloads might be a bit more flexible across a continuum of hybrid cloud, but other considerations are brought to the data. That can be security, regulation, control, compliance, data sovereignty, GDPR, and so forth. Are you seeing your customers looking at this divide between applications and data, and how they are able to rationalize one versus the other?

Skelton: Applications, as you have just mentioned, are the simpler things to move into a cloud model, but the data is really the crown jewels of the business, and people are nervous about putting that into public cloud. So what we're seeing a lot of is putting applications into the public cloud for the agility, elasticity, and global reach, and trying to keep data on-premises, because they're nervous about breaches in the service providers’ data centers.

That's what we are seeing, but we're also seeing a rise of things like object storage. We're working with Scality, for example, and they have a unique solution for blending public and on-premises storage, so we can pin data to certain platforms in a secure data center and then, where the data is not as critical, move it into a public-cloud environment.
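The pinning Skelton describes is essentially a data-placement policy: classify each object, then route it to the platform its sensitivity allows. A hypothetical sketch -- the classification labels and tier names here are invented for illustration:

```python
# Classifications that must never leave the secure on-premises tier (assumed labels).
SENSITIVE = {"pii", "regulated", "crown-jewels"}

def placement(obj: dict) -> str:
    """Route sensitive data on-premises; tier everything else to public cloud."""
    if obj.get("classification") in SENSITIVE:
        return "on-prem-secure"
    return "public-cloud"

print(placement({"name": "client_matter_files", "classification": "regulated"}))  # -> on-prem-secure
print(placement({"name": "marketing_assets"}))  # -> public-cloud
```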

Gardner: It sounds like you've been quite busy. Please tell us about OCSL, an overview of your company and where you're focusing most of your efforts in terms of hybrid computing.

Rebrand and refresh

Skelton: OCSL has been around for 26 years as a business. Recently, we've been through a rebrand and a refresh of what we're focusing on, and we're moving to more of a services organization, leading with our people and our consultants.

We're focusing on transforming customers and clients into cloud environments, whether that's applications, data center, cloud, or hybrid cloud. We're trying to get customers on that journey of transformation, engaging with business-level people on business requirements, and working out how we make cloud a reality, rather than just pointing at a product and saying go and do whatever you want with it. We're finding out what those businesses want, what the key requirements are, and then finding the right cloud models to fit them.

Gardner: So many organizations are facing not just a retrofit or a rethinking around IT, but truly a digital transformation for the entire organization. There are many cases of sloughing off business lines, and other cases of acquiring. It's an interesting time in terms of a mass reconfiguration of businesses and how they identify themselves.

Skelton: What's changed for me is that, when I go and speak to a customer, I'm no longer just speaking to the IT guys; I'm actually engaging with the finance officers, the marketing officers, the digital officers -- that's the common one that is creeping up now. And it's a very different conversation.


We're looking at business outcomes now, rather than focusing on, "I need this disk, this product." It's more: "I need to deliver this service back to the business." That's how we're changing as a business. It's doing that business consultancy, engaging with that, and then finding the right solutions to fit requirements and truly transform the business.

Gardner: Of course, HPE has been going through transformations itself for the past several years, and that doesn't seem to be slowing up much. Tell us about the alliance between OCSL and HPE. How do you come together as a whole greater than the sum of the parts?

Skelton: HPE is transforming and becoming a more agile organization, with some of the spinoffs that we've had recently aiding that agility. OCSL has worked in partnership with HPE for many years, and it's all about going to market together and working together to engage with the customers at right level and find the right solutions. We've had great success with that over many years.

Gardner: Now, let’s go to the "show rather than tell" part of our discussion. Are there some examples that you can look to, clients that you work with, that have progressed through a transition to hybrid computing, hybrid cloud, and enjoyed certain benefits or found unintended consequences that we can learn from?

Skelton: We've had a lot of successes in the last 12 months in taking clients on the journey to hybrid cloud. One of the key ones that resonates with me is a legal firm that we've been working with. They were in a bit of a state. They had an infrastructure that was aging and unstable, and it wasn't delivering quality service back to the lawyers who were trying to embrace technology -- mobile devices, dictation software, those kinds of things.

We came in with an initial proposal on how we would actually address some of those problems. We challenged them, and said that we needed to go through a stabilization phase. Public cloud was not going to be the immediate answer. They were being courted by the big vendors, as everyone is, about public cloud, and they were saying it was the Nirvana for them.

We challenged that and we got them to a stable platform first, built on HPE hardware. We got instant stability for them. So, the business saw immediate returns and delivery of service. It’s all about getting that impactful thing back to the business, first and foremost.

Building cloud model

Now, we're working through each of their service lines, looking at how we can break them up and transform them into a cloud model. That involves deconstructing the apps and thinking about how we can use pockets of public cloud in line with the hybrid on-premises data-center infrastructure.

They've now started to see real innovative solutions taking that business forward, but they got instant stability.

Gardner: Were there any situations where organizations were very high-minded and fanciful about what they were going to get from cloud that may have led to some disappointment -- so unintended consequences. Maybe others might benefit from hindsight. What do you look out for, now that you have been doing this for a while in terms of hybrid cloud adoption?

Skelton: One of the things I've seen a lot of with cloud is that people have bought into the messaging from the big public cloud vendors about how they can just turn on services and keep consuming, consuming, consuming. A lot of people have gotten themselves into a state where bills have been rising and rising, and the economics are looking ridiculous. The finance officers are now coming back and saying they need to rein that back in. How do they put some control around that?

That’s where hybrid is helping, because you can hook some workloads back into an isolated data center and start to move those workloads back. But the key for me is that it comes down to putting some thought into what you're putting into the cloud. Think through how you can transform and use the services properly. Don't just turn everything on because it's there and a click of a button away; actually put some design and planning into adopting cloud.

Gardner: It also sounds like the IT people might need to go out and have a pint with the procurement people and learn a few basics about good contract writing, terms and conditions, and putting in clauses that allow you to back out, if needed. Is that something that we should be mindful of -- IT being in the procurement mode as well as specifying technology mode?

Skelton: Procurement definitely needs to be involved in the initial set-up with the cloud whenever they're committing to a consumption number, but then once that’s done, it’s IT’s responsibility in terms of how they consume it. Procurement needs to stay involved all the way through, keeping constant track of what’s going on -- and that’s not happening.

The IT guys don’t really care about the cost; they care about the widgets, turning things on, and playing around with them. I don’t think they really realize how much this is going to cost. So yeah, there's a bit of a disjoint in lots of organizations: procurement is involved in the upfront piece, then it goes away, and then IT comes in and spends all of the money.

Gardner: In the complex service delivery environment, that procurement function probably should be constant and vigilant.

Big change in procurement

Skelton: Procurement departments are going to change. We're starting to see that in some of the bigger organizations. They're closer to the IT departments. They need to understand that technology and what’s being used, but that’s quite rare at the moment. I think that probably over the next 12 months, that’s going to be a big change in the larger organizations.

Gardner: Before we close, let's take a look to the future. A year or two from now, if we sit down again, I imagine that more micro services will be involved and containerization will have an effect, where the complexity of services and what we even think of as an application could be quite different, more of an API-driven environment perhaps.

So the complexity about managing your cloud and hybrid cloud to find the right mix, and pricing that, and being vigilant about whether you're getting your money’s worth or not, seems to be something where we should start thinking about applying artificial intelligence (AI), machine learning, what I like to call BotOps, something that is going to be there for you automatically without human intervention.

Does that sound on track to you, and do you think that we need to start looking to advanced automation and even AI-driven automation to manage this complex divide between organizations and cloud providers?

Skelton: You hit a lot of key points there in terms of where the future is going. I think we're still in the phase where we're trying to build the right platforms to be ready for the future. We see the recent releases of HPE Synergy, for example, being able to support these modern platforms, and that's really allowing us to embrace things like microservices. Docker and Mesosphere are two platforms that will disrupt organizations and the way we do things, but you need to find the right platform first.

Hopefully, in 12 months, we'll have those platforms and we can then start to embrace some of this great new technology and really rethink our applications. And it’s a challenge to the ISVs. They've got to work out how they can take advantage of some of these technologies.


We're seeing a lot of talk about serverless computing. It's where there's no standing infrastructure and you spin up resources as and when you need them. The classic use case for that is Uber; they've built a whole business on that serverless model. I think that in 12 months' time, we're going to see a lot more of that in more of the enterprise-type organizations.

I don’t think we have it quite clear in our minds how we're going to embrace that, but it's the ISV community that really needs to start driving it. Beyond that, it's absolutely AI and bots. We're all going to be talking to computers, and they're going to be responding with very human sorts of reactions. That's the next wave.

I'm bringing that into enterprise organizations to see how we can solve some business challenges. Service-desk management is one of the use cases where we're seeing, in some of our clients, whether they can get immediate responses from bots to common queries, so they don’t need as many support staff. It’s already starting to happen.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Fast acquisition of diverse unstructured data sources makes IDOL API tools a star at LogitBot

The next BriefingsDirect Voice of the Customer digital transformation case study highlights how high-performing big-data analysis powers an innovative artificial intelligence (AI)-based investment opportunity and evaluation tool. We'll learn how LogitBot in New York identifies, manages, and contextually categorizes truly massive and diverse data sources.

By leveraging entity recognition APIs, LogitBot not only provides investment evaluations from across these data sets, it delivers the analysis as natural-language information directly into spreadsheets as the delivery endpoint. This is a prime example of how complex cloud-to-core-to-edge processes and benefits can be managed and exploited using the most responsive big-data APIs and services.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. 

To describe how a virtual assistant for targeting investment opportunities is being supported by cloud-based big-data services, we're joined by Mutisya Ndunda, Founder and CEO of LogitBot, and Michael Bishop, CTO of LogitBot, in New York. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Let’s look at some of the trends driving your need to do what you're doing with AI and bots, bringing together data, and then delivering it in the format that people want most. What’s the driver in the market for doing this?

Ndunda: LogitBot is all about trying to eliminate friction between people who have very high-value jobs and some of the more mundane things that could be automated by AI.

 

Today, in finance, the industry, in general, searches for investment opportunities using techniques that have been around for over 30 years. What tends to happen is that the people who are doing this should be spending more time on strategic thinking, ideation, and managing risk. But without AI tools, they tend to get bogged down in the data and in the day-to-day. So, we've decided to help them tackle that problem.

Gardner: Let the machines do what the machines do best. But how do we decide where the demarcation is between what the machines do well and what the people do well, Michael?

Bishop: We believe in empowering the user and not replacing the user. So, the machine is able to go in-depth and do what a high-performing analyst or researcher would do at scale, and it does that every day, instead of once a quarter, for instance, when research analysts would revisit an equity or a sector. We can do that constantly, react to events as they happen, and replicate what a high-performing analyst is able to do.

Gardner: It’s interesting to me that you're not only taking a vast amount of data and putting it into a useful format and qualitative type, but you're delivering it in a way that’s demanded in the market, that people want and use. Tell me about this core value and then the edge value and how you came to decide on doing it the way you do?

Evolutionary process

Ndunda: It’s an evolutionary process that we've embarked on or are going through. The industry is very used to doing things in a very specific way, and AI isn't something that a lot of people are necessarily familiar with in financial services. We decided to wrap it around things that are extremely intuitive to an end user who doesn't have the time to learn technology.

So, we said that we'll try to leverage as many things as possible in the back via APIs and all kinds of other things, but the delivery mechanism in the front needs to be as simple or as friction-less as possible to the end-user. That’s our core principle.

Humanization of Machine Learning
For Big Data Success
Learn More

Bishop: Finance professionals generally don't like black boxes and mystery, and obviously, when you're dealing with money, you don’t want to get an answer out of a machine you can’t understand. Even though we're crunching a lot of information and making a lot of inferences, at the end of the day, they could unwind it themselves if they wanted to verify the inferences that we have made.

We're wrapping up an incredibly complicated amount of information, but it still makes sense at the end of the day. It’s still intuitive to someone. There's not a sense that this is voodoo under the covers.

Gardner: Well, let’s pause there. We'll go back to the data issues and the user-experience issues, but tell us about LogitBot. You're a startup, you're in New York, and you're focused on Wall Street. Tell us how you came to be and what you do, in a more general sense.

Ndunda: Our professional background has always been in financial services. Personally, I've spent over 15 years in financial services, and my career led me to what I'm doing today.

In the 2006-2007 timeframe, I left Merrill Lynch to join a large proprietary market-making business called Susquehanna International Group. They're one of the largest providers of liquidity around the world. Chances are whenever you buy or sell a stock, you're buying from or selling to Susquehanna or one of its competitors.

What had happened in that industry was that people were embracing technology, but it was algorithmic trading, what has become known today as high-frequency trading. At Susquehanna, we resisted that notion, because we said machines don't necessarily make decisions well, and this was before AI had been born.

Internally, we went through this period where we had a lot of discussions around, are we losing out to the competition, should we really go pure bot, more or less? Then, 2008 hit and our intuition of allowing our traders to focus on the risky things and then setting up machines to trade riskless or small orders paid off a lot for the firm; it was the best year the firm ever had, when everyone else was falling apart.

That was the first piece that got me to understand or to start thinking about how you can empower people and financial professionals to do what they really do well and then not get bogged down in the details.

Then, I joined Bloomberg and I spent five years there as the head of strategy and business development. The company has an amazing business, but it's built around the notion of static data. What had happened in that business was that, over a period of time, we began to see the marketplace valuing analytics more and more.

Make a distinction

Part of the role that I was brought in to do was to help them unwind that and decouple the two things -- to make a distinction within the company about static information versus analytical or valuable information. The trend that we saw was that hedge funds, especially the ones that were employing systematic investment strategies, were beginning to do two things: to embrace AI or technology to empower their traders, and also to look deeper into analytics versus static data.

That was what brought me to LogitBot. I thought we could do it really well, because the players themselves don't have the time to do it and some of the vendors are very stuck in their traditional business models.

Bishop: We're seeing a kind of renaissance here, or we're at a pivotal moment, where we're moving away from analytics in the sense of business reporting tools or understanding yesterday. We're now able to mine data, get insightful, actionable information out of it, and then move into predictive analytics. And it's not just statistical correlations. I don’t want to offend any quants, but a lot of technology [to further analyze information] has come online recently, and more is coming online every day.

For us, Google had released TensorFlow, and that made a substantial difference in our ability to reason about natural language. Had it not been for that, it would have been very difficult one year ago.

At the moment, technology is really taking off in a lot of areas at once. That enabled us to move from static analysis of what's happened in the past and move to insightful and actionable information.

Ndunda: What Michael touched on there is really important. A lot of traditional ways of looking at financial investment opportunities amount to saying that historically, this has happened, so history should repeat itself. We're in markets where nothing that's happening today has really happened in the past. So, relying on a backward-looking mechanism to interpret the future is really dangerous, versus having a more grounded approach that can actually incorporate things that are nontraditional in many different ways.

So, unstructured data, what investors are thinking, what central bankers are saying -- all of those are really important inputs that weren't part of any model 10 or 20 years ago. Without machine learning and some of the things that we are doing today, it’s very difficult to incorporate any of that and make sense of it in a structured way.

Gardner: So, if the goal is to make outlier events your friend and not your enemy, what data do you go to in order to close the gap between what's happened and what the reaction should be, and how do you best get that data and make it manageable for your AI and machine-learning capabilities to exploit?

Ndunda: Michael can probably add to this as well. We do not discriminate as far as data goes. What we like to do is have no opinion on data ahead of time. We want to get as much information as possible and then let a scientific process lead us to decide what data is actually useful for the task that we want to deploy it on.

As an example, we're very opportunistic about acquiring information about who the most important people at companies are and how they're connected to each other. Does this person serve on a board with that one, or how do they know each other? It may not have any application at that very moment, but over the course of time, you end up building models that are actually really interesting.

We scan over 70,000 financial news sources. We capture news information across the world. We don't necessarily use all of that information on a day-to-day basis, but at least we have it and we can decide how to use it in the future.

We also monitor anything that companies file and what management teams talk about at investor conferences or on phone conversations with investors.

Bishop: Conference calls, videos, interviews.

Audio to text

Ndunda: HPE has a really interesting technology that they have recently put out. You can transcribe audio to text, and then we can apply our text processing on top of that to understand what management is saying in a structured, machine-based way. Instead of 50 people listening to 50 conference calls, you could just have a machine do it for you.
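The pipeline Ndunda describes can be sketched in a few lines. This is an illustrative Python stub, not the Haven OnDemand API itself: `transcribe()` stands in for the speech-to-text call, and the keyword lists are invented for the example.

```python
# Illustrative pipeline: earnings-call audio -> transcript -> machine signal.
# transcribe() is a stub standing in for a speech-to-text service; a real
# implementation would upload the audio file and poll for the transcript.

POSITIVE = {"growth", "strong", "record", "beat"}
NEGATIVE = {"headwind", "headwinds", "decline", "miss", "weak"}

def transcribe(audio_path: str) -> str:
    # Stub transcript so the pipeline logic is self-contained and runnable.
    return "We delivered record growth this quarter despite currency headwinds."

def sentiment_score(text: str) -> int:
    # Positive minus negative keyword hits -- a toy stand-in for real
    # sentiment analysis of what management is saying.
    words = {w.strip(".,").lower() for w in text.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

print(sentiment_score(transcribe("acme_q3_call.wav")))  # one call, scored by machine
```

The point is the shape of the workflow: once audio becomes text, the same text-analysis machinery used on filings and news applies unchanged.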

Gardner: Something we can do there that we couldn't have done before is that you can also apply something like sentiment analysis, which you couldn’t have done if it was a document, and that can be very valuable.

Bishop: Yes, even tonal analysis. There are a few theories on that, that may or may not pan out, but there are studies around tone and cadence. We're looking at it and we will see if it actually pans out.

Gardner: And so do you put this all into your own on-premises data-center warehouse or do you take advantage of cloud in a variety of different means by which to corral and then analyze this data? How do you take this fire hose and make it manageable?

Bishop: We do take advantage of the cloud quite aggressively. We're split between SoftLayer and Google. At SoftLayer we have bare-metal machines, including some POWER machines with high-power GPUs.

On the Google side, we take advantage of Bigtable and BigQuery and some of their infrastructure tools. And we have good old PostgreSQL in there, as well as DataStax Cassandra with its Graph as the graph engine. We make liberal use of the HPE Haven APIs as well, and TensorFlow, as I mentioned before. So, it’s a smorgasbord of things you need to corral in order to get the job done. We found it very hard to find all of that wrapped in a bow with one provider.

We're big proponents of Kubernetes and Docker as well, and we leverage that to avoid lock-in where we can. Our workload can migrate between Google and the SoftLayer Kubernetes cluster. So, we can migrate between hardware or virtual machines (VMs), depending on the horsepower that’s needed at the moment. That's how we handle it.
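The portability Bishop describes ultimately comes down to a placement decision per workload. As a minimal sketch, that decision can be reduced to a small routing function; the cluster context names below are hypothetical, not LogitBot's actual configuration.

```python
# Toy scheduler: route a containerized workload to whichever Kubernetes
# cluster fits its horsepower needs. Context names are made up for the example.

def pick_cluster(needs_gpu: bool, cores: int) -> str:
    if needs_gpu or cores > 32:
        # Heavy training jobs go to the bare-metal, GPU-equipped cluster.
        return "softlayer-baremetal"
    # Everything else runs on cheaper, quickly provisioned VMs.
    return "gcp-vms"

print(pick_cluster(needs_gpu=True, cores=8))    # GPU model training
print(pick_cluster(needs_gpu=False, cores=4))   # light stateless service
```

Because the workload itself is packaged with Docker, the same image runs under either context; only the scheduling target changes.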

Gardner: So, maybe 10 years ago you would have been in a systems-integration capacity, but now you're in a services-integration capacity. You're doing some very powerful things at a clip and probably at a cost that would have been impossible before.

Bishop: I certainly remember placing an order for a server, waiting six months, and then setting up the RAID drives. It's amazing that you can just flick a switch and you get a very high-powered machine that would have taken six months to order previously. In Google, you spin up a VM in seconds. Again, that's of a horsepower that would have taken six months to get.

Gardner: So, unprecedented innovation is now at our fingertips when it comes to the IT side of things, unprecedented machine intelligence, now that the algorithms and APIs are driving the opportunity to take advantage of that data.

Let's go back to thinking about what you're outputting and who uses that. Is the investment result that you're generating something that goes to a retail type of investor? Is this something you're selling to investment houses or a still undetermined market? How do you bring this to market?

Natural language interface

Ndunda: Roboto, which is the natural-language interface into our analytical tools, can be custom tailored to respond, based on the user's level of financial sophistication.

At present, we're trying them out on a semiprofessional investment platform, where people are professional traders, but not part of a major brokerage house. They obviously want to get trade ideas, they want to do analytics, and they're a little bit more sophisticated than people who are looking at investments for their retirement account.  Rob can be tailored for that specific use case.

He can also respond to somebody who is managing a portfolio at a hedge fund. The level of depth that he needs to consider is the only differential between those two things.

In the back, he may do an extra five steps if the person asking the question worked at a hedge fund, versus if the person was just asking why Apple is up today. If you're a retail investor, you don’t want to do a lot of in-depth analysis.

Bishop: You couldn’t take the app and do anything with it or understand it.

Ndunda: Rob is an interface, but the analytics are available via multiple venues. So, you can access the same analytics via an API, a chat interface, the web, or a feed that streams into you. It just depends on how your systems are set up within your organization. But, the data always will be available to you.
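A minimal sketch of that "one analytic, many venues" design: the core computation is shared, and each delivery channel is just a thin formatter over it. The function names and numbers here are invented for illustration.

```python
import json

# One shared analytic, surfaced through multiple venues (chat, feed, API).
def analytic(ticker: str) -> dict:
    return {"ticker": ticker, "six_month_view": 0.017}  # placeholder output

def as_chat(result: dict) -> str:
    # Conversational rendering for a Rob-style chat interface.
    return (f"{result['ticker']} looks set to return about "
            f"{result['six_month_view']:.1%} over six months.")

def as_feed(result: dict) -> str:
    # Machine-readable rendering for an API or streaming feed.
    return json.dumps(result)

r = analytic("AAPL")
print(as_chat(r))
print(as_feed(r))
```

The same result object flows to every channel, which is what keeps the analytics consistent regardless of how a customer's systems are set up.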

Gardner: Going out to that edge equation, that user experience, we've talked about how you deliver this to the endpoints, customary spreadsheets, cells, pivots, whatever. But it also sounds like you are going toward more natural language, so that you could query, rather than a deep SQL environment, like what we get with a Siri or the Amazon Echo. Is that where we're heading?

Bishop: When we started this, trying to parameterize everything that you could ask into enough checkboxes and forms pollutes the screen. The system has access to an enormous amount of data that you can't create a parameterized screen for. We found it was a bit of a breakthrough when we were able to start using natural language.

TensorFlow made a huge difference here in natural language understanding, understanding the intent of the questioner, and being able to parameterize a query from that. If our initial findings here pan out or continue to pan out, it's going to be a very powerful interface.
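To make "parameterize a query from intent" concrete, here is a deliberately tiny, regex-based sketch of the output shape. The real system uses trained TensorFlow models rather than keyword rules, and the entity list is a toy.

```python
import re

# Toy intent parser: free-form question -> parameterized query.
TICKERS = {"apple": "AAPL", "google": "GOOGL", "microsoft": "MSFT"}

def parameterize(question: str) -> dict:
    q = question.lower()
    # Crude intent detection; a trained model replaces this in practice.
    intent = "forecast" if any(w in q for w in ("likely", "expect", "forecast")) else "explain"
    # Crude entity linking against a toy ticker table.
    ticker = next((t for name, t in TICKERS.items() if name in q), None)
    m = re.search(r"next (\d+) months?", q)
    return {"intent": intent,
            "ticker": ticker,
            "horizon_months": int(m.group(1)) if m else None}

print(parameterize("What is the likely return of Apple over the next 6 months?"))
```

However the intent is detected, the downstream analytics only ever see the structured parameters, which is what makes the natural-language front end swappable.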

I can't imagine having to go back to a SQL query if you're able to do it in natural language -- and it really does pan out this time, because we've had a few turns of the handle with alleged natural-language querying before.

Gardner: And always a moving target. Tell us specifically about SentryWatch and Precog. How do these shake out in terms of your go-to-market strategy?

How everything relates

Ndunda: One of the things that we have to do to be able to answer a lot of the questions our customers may have is to monitor financial markets, and what's impacting them, on a continuous basis. SentryWatch is literally a byproduct of that process. Because we're monitoring over 70,000 financial news sources, we're analyzing the sentiment, doing deep text analysis, identifying entities and how they're related to each other in all of these news events, and sticking all of that into a knowledge graph of how everything relates to everything else.

It ends up being a really valuable tool, not only for us, but for other people, because while we're building models, there are also a lot of hedge funds that have proprietary models or proprietary processes that could benefit from that very same organized, relational data store of news. That's what SentryWatch is and that's how it's evolved. It started off as something that we were doing as an input, and it's actually now a valuable output, or a standalone product.
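A toy version of that news-driven knowledge graph can illustrate the structure: entities that co-occur in a story get an edge, with per-story sentiment kept on the edge. The hand-written stories below stand in for the entity extraction the Haven APIs perform over the 70,000 sources.

```python
from collections import defaultdict
from itertools import combinations

graph = defaultdict(list)  # (entity_a, entity_b) -> list of story sentiments

stories = [
    {"entities": ["Apple", "European Commission"], "sentiment": -0.6},
    {"entities": ["Apple", "Foxconn"], "sentiment": 0.2},
]

for story in stories:
    # Link every pair of entities mentioned together in one story.
    for a, b in combinations(sorted(story["entities"]), 2):
        graph[(a, b)].append(story["sentiment"])

def related(entity: str) -> list:
    # Every entity connected to `entity` by at least one story.
    return sorted({x for pair in graph for x in pair if entity in pair} - {entity})

print(related("Apple"))
```

Queries like "who is connected to Apple, and how has that relationship been covered?" then become simple graph lookups rather than fresh text scans.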

Precog is a way for us to showcase the ability of a machine to be predictive and not be backward looking. Again, when people are making investment decisions or allocation of capital across different investment opportunities, you really care about your forward return on your investments. If I invested a dollar today, am I likely to make 20 cents in profit tomorrow or 30 cents in profit tomorrow?

We're using pretty sophisticated machine-learning models that can take into account unstructured data sources as part of the modeling process. That will give you these forward expectations about stock returns in a very easy-to-use format, where you don't need to have a PhD in physics or mathematics.

You just ask, "What is the likely return of Apple over the next six months," taking into account what's going on in the economy. Say Apple was fined $14 billion. That can be quickly added into a model and reflect a new view in a matter of seconds, versus sitting down with a spreadsheet and trying to figure out how it all works out.
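As a back-of-the-envelope illustration of folding such an event into a forward view (every figure and coefficient here is invented; Precog's actual models are far richer than this):

```python
# Toy event adjustment: scale a one-off cost by market cap and subtract it
# from the baseline six-month return view. All numbers are illustrative.
def expected_return(base_view: float, event_cost: float, market_cap: float) -> float:
    return base_view - event_cost / market_cap

# Baseline 4% six-month view, a $14B fine, roughly $600B market cap.
print(round(expected_return(0.04, 14e9, 600e9), 4))  # -> 0.0167
```

The update is instantaneous once the event is quantified, which is the contrast Ndunda draws with manually reworking a spreadsheet model.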

Gardner: Even for Apple, that's a chunk of change.

Bishop: It's a lot of money, and you can imagine that there were quite a few analysts on Wall Street in Excel, updating their models around this so that they could have an answer by the end of the day, where we already had an answer.

Gardner: How do the HPE Haven OnDemand APIs help the Precog when it comes to deciding those sources, getting them in the right format, so that you can exploit?

Ndunda: The beauty of the platform is that it simplifies a lot of development processes that an organization of our size would have to take on themselves.

The nice thing about it is that a drag-and-drop interface is really intuitive; you don't need to be specialized in Java, Python, or whatever it is. You can set up your intent in a graphical way, and then test it out, build it, and expand it as you go along. The Lego-block structure is really useful, because if you want to try things out, it's drag and drop, connect the dots, and then see what you get on the other end.

For us, that's an innovation that we haven't seen with anybody else in the marketplace and it cuts development time for us significantly.

Gardner: Michael, anything more to add on how this makes your life a little easier?

Lowering cost

Bishop: For us, lowering the cost in time to run an experiment is very important when you're running a lot of experiments, and the Combinations product enables us to run a lot of varied experiments, using a variety of the HPE Haven APIs in different combinations, very quickly. You're able to get your development time down from the week or two it would otherwise take to wire up each API.

In the same amount of time, you're able to wire the initial connection and then you have access to pretty much everything in Haven. You turn it over to either a business user, a data scientist, or a machine-learning person, and they can drag and drop the connectors themselves. It makes my life easier and it makes the developers’ lives easier because it gets back time for us.

Gardner: So, not only have we been able to democratize the querying, moving from SQL to natural language, for example, but we’re also democratizing the choice on sources and combinations of sources in real time, more or less for different types of analyses, not just the query, but the actual source of the data.

Bishop: Correct.

Ndunda: Again, the power of a lot of this stuff is in the unstructured world, because valuable information typically tends to be hidden in documents. In the past, you'd have to have a team of people to scour through text, extract what they thought was valuable, and summarize it for you. You could miss out on 90 percent of the other valuable stuff that's in the document.

Now, the ability to drag and drop and then go through a document in five different iterations by just tweaking a parameter is really useful.

Gardner: So those will be IDOL-backed APIs that you are referring to.

Ndunda: Exactly.

Bishop: It’s something that would have been hard for an investment bank to process, even a few years ago. Everyone is on the same playing field here, or starting from the same base, but dealing with unstructured data has traditionally been a very difficult problem. You have a lot of technologies coming online as APIs; at the same time, they're also coming out as traditional on-premises [software and appliance] solutions.

We're all starting from the same gate here. Some folks are a little ahead, but I'd say that Facebook is further ahead than an investment bank in its ability to reason over unstructured data. In our world, I feel like we're starting basically at the same place that Goldman or Morgan would be.

Gardner: It's a very interesting reset that we’re going through. It's also interesting that we talked earlier about the divide between where the machine and the individual knowledge worker begins or ends, and that's going to be a moving target. Do you have any sense of how that changes its characterization of what the right combination is of machine intelligence and the best of human intelligence?

Empowering humans

Ndunda: I don’t foresee machines replacing humans, per se. I see them empowering humans, and to the extent that your role is not completely based on a task, if it's based on something where you actually manage a process that goes from one end to another, those particular positions will be there, and the machines will free our people to focus on that.

But, in the case where you have somebody who is really responsible for something that can be automated, then obviously that will go away. Machines don't eat, they don’t need to take vacation, and if it’s a task where you don't need to reason about it, obviously you can have a computer do it.

What we're seeing now is that if you have a machine sitting side by side with a human, and the machine can pick up on how the human reasons with some of the new technologies, then the machine can do a lot of the grunt work, and I think that’s the future of all of this stuff.

Bishop: What we're delivering is that we distill a lot of information, so that a knowledge worker or decision-maker can make an informed decision, instead of watching CNBC and being a single-source reader. We can go out and scour the best of all the information, distill it down, and present it, and they can choose to act on it.

Our goal here is not to make the next jump and make the decision. Our job is to present the information to a decision-maker.

Gardner: It certainly seems to me that the organization, big or small, retail or commercial, can make the best use of this technology. Machine learning, in the end, will win.

Ndunda: Absolutely. It is a transformational technology, because for the first time in a really long time, the reasoning piece of it is within grasp of machines. These machines can operate in the gray area, which is where the world lives.

Gardner: And that gray area can almost have unlimited variables applied to it.

Ndunda: Exactly. Correct.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

How Lastminute.com Uses Machine Learning to Advance the Use of Big Data for Real-time Travel Booking

The next BriefingsDirect Voice of the Customer digital transformation case study highlights how online travel and events pioneer lastminute.com leverages big-data analytics with speed at scale to provide business advantages to online travel services.

We'll explore how lastminute.com manages massive volumes of data to support cutting-edge machine-learning algorithms to allow for speed and automation in the rapidly evolving global online travel research and bookings business.

Connecting the Connected Car: What Needs to Happen Now

The connected car is one of the most highly anticipated technologies at this year’s CES. Constant vehicle connectivity to the Internet is opening up new possibilities in the areas of occupant safety, efficiency and convenience (to name a few), and will transform personal transportation as we know it.

Every major auto manufacturer is committing to the connected car in some way. While what “connected” means is likely to vary, there’s little doubt we’ll see cars get more sensors, more apps, more in-dash control systems and more automation.

Less clear is how we’ll take advantage of the technology that’s working its way into millions of new vehicles. Making cars smarter is one thing, but we also need to improve the IQ of roads, cities, bridges, garages, traffic systems and more so that connected cars have things to connect to. At HPE, we’re helping to usher in a smarter driving experience with a unique combination of technologies, expertise and partnerships.

Veikkaus digitally transforms as it emerges as new combined Finnish national gaming company

The next BriefingsDirect Voice of the Customer digital transformation case study highlights how newly combined Finnish national gaming company, Veikkaus, is managing a complex merger process while also bringing more of a digital advantage to both its operations and business model. We'll now explore how Veikkaus uses a powerful big-data analytics platform to respond rapidly to the challenges of digitization.

WWT took an enterprise Tower of Babel and delivered comprehensive intelligent search

The next BriefingsDirect Voice of the Customer digital transformation case study highlights how World Wide Technology, known as WWT, in St. Louis, found itself with a very serious yet somehow very common problem -- users simply couldn’t find relevant company content.

We'll explore how WWT reached deep into its applications, data, and content to rapidly and efficiently create a powerful Google-like, pan-enterprise search capability. Not only does it search better and empower users, the powerful internal index sets the stage for expanded capabilities using advanced analytics to engender a more productive and proactive digital business culture.

HPE's Andy Bergholz on NonStop Future and Innovation

As everyone returns home from Connect Technical Boot Camp and starts to process what they learned, we take a look back on the last year in NonStop, what has been done in terms of innovation, and what we can expect in the future. As technology and the marketplace change, so does the need for companies to be able to evolve with technology and trends and stay competitive in an ever-changing landscape. People are changing the way they do business and consume products and services. With options like Netflix, Amazon, and even Uber making it easier than ever for consumers, businesses must adapt.

Nobody knows more about NonStop innovation than Andy Bergholz, director of engineering at HPE. “The marketplace is changing at an extraordinary pace,” he said. “So companies need to be able to defend and then attack against these market disrupters.” That’s why HPE has focused on and committed itself to innovation, to ensure that NonStop continues to provide businesses the options necessary for such forward progress: a focus on the digital core, a commitment to mission-critical data, and more choices on how to deploy and utilize NonStop.

HPE takes aim at customer needs for speed and agility in age of IoT, hybrid everything

A leaner, more streamlined Hewlett Packard Enterprise (HPE) advanced across several fronts at HPE Discover 2016 in London, making inroads into hybrid IT, Internet of Things (IoT), and on to the latest advances in memory-based computer architecture. All the innovations are designed to help customers address the age of digital disruption with speed, agility, and efficiency.

Addressing a Discover audience for the first time since HPE announced spinning off many software lines to Micro Focus, Meg Whitman, HPE President and CEO, said the company is not only committed to those assets, becoming a major owner of Micro Focus in the deal, but is also building its software investments.

"HPE is not getting out of software but doubling-down on the software that powers the apps and data workloads of hybrid IT," she said Tuesday at London's ExCel exhibit center.

"Massive compute resources need to be brought to the edge, powering the Internet of Things (IoT). ... We are in a world now where everything computes, and that changes everything," said Whitman, who has now been at the helm of HPE and HP for five years.

HPE's new vision: To be the leading provider of hybrid IT, to run today's data centers, and then bridge the move to multi-cloud and empower the intelligent edge, said Whitman. "Our goal is to make hybrid IT simple and to harness the intelligent edge for real-time decisions" to allow enterprises of all kinds to win in the marketplace, she said.