Cyber Observer – a powerful security dashboard that reminded me of an earlier time

The other day I received a note on LinkedIn from an individual I worked with back at EDS. He mentioned that the company he currently works for is focused on security. Since security needs to be at the top of the list of concerns at all levels of an organization today, I thought I'd take a deeper look.

The software is called Cyber Observer (they have a fairly effective marketing overview video on their site). Though this solution is focused on enterprise security monitoring, it reminded me of the data center monitoring programs that came out in the late 80s and 90s, which provided status dashboards and information focused on reducing time to action for system events. CA Unicenter was one popular example.

Back in the late 80s I had system administration leadership over the largest VAX data center GM had. We had hundreds of VAXen, PDPs, and HP 1000s of all sizes scattered over nine or ten plants. Keeping them all running required significant insight into what was going on at a moment's notice.

Fortunately, today folks can use the cloud for many of the types of systems we had to monitor, and the hardware monitoring is outsourced to the cloud providers. Plant floor systems, though, are still an area that needs to be monitored.

One of the issues we had keeping hundreds of machines running was that the flood of minor issues being logged and reported can easily lead to 'alert fatigue'. Those responsible can lose the big picture (chicken little syndrome). Back then, we put a DECTalk in our admin area; when something really serious happened, it yelled at us until it was fixed. We thought that was pretty advanced for its time.
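
That severity-gated pattern still applies today. Here is a minimal sketch of the idea in Python – the threshold value and the speak() stand-in for the DECTalk are my own illustrative choices, not any particular product's design:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ops")

SPEAK_THRESHOLD = logging.CRITICAL  # only truly serious events interrupt a human

def speak(message: str) -> None:
    # Stand-in for the DECTalk: route to text-to-speech, a pager, etc.
    print(f"*** ATTENTION: {message} ***")

def report(severity: int, message: str) -> None:
    """Log everything, but only escalate past the noise threshold."""
    log.log(severity, message)
    if severity >= SPEAK_THRESHOLD:
        speak(message)

report(logging.WARNING, "Disk 73% full on node 4")    # logged quietly
report(logging.CRITICAL, "Primary VAX cluster down")  # yelled at the admins
```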

I asked how Cyber Observer handled this information overload concern, since the software is primarily targeted at leaders and executives, and we all know the attention span of most managers for technical issues. I also asked whether the software takes a proactive (use of honeypots) or a reactive approach. Now that both software honeypots (HoneyD, among others) and hardware honeypots (Canary) are relatively easy to access, they should be part of any large organization's approach to security.

He explained that the alert and dashboarding system was very tunable at both the organizational and individual level.

Although it takes more of a dashboard approach to sharing the information, details are available to show why a concern reached the level it did.

An example he gave: a new domain administrator is added in Active Directory. The score next to the account management domain would go down and show red. When the user drills down, the alert would state that a new domain admin was added. The score in the system would be reduced, and eventually the system baseline would adjust to the change, although the score would remain lower. The administrative user would have to manually change the threshold or remove the new domain admin (if it is rogue or unapproved); only then would the score go back to its previous number (if no other events took place). Some threshold tolerances come preset out of the box based on expected values (for example, whether the NAC is in protect mode rather than alert mode, or whether Active Directory password complexity is turned on). Other thresholds are organizationally dependent, and the user needs to set them appropriately, as with the number of domain admins.
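
Cyber Observer's actual scoring model isn't public, but the behavior he described maps to a simple threshold-and-penalty scheme. Here's a hypothetical sketch – the names, weights, and values are all invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    threshold: int  # expected maximum (preset or organization-specific)
    observed: int   # current value pulled from the monitored tool
    penalty: int    # score deduction while out of tolerance

def domain_score(metrics: list[Metric], base: int = 100) -> int:
    """Score drops while any metric is out of tolerance; it recovers only
    when the threshold is raised (approved) or the value is rolled back."""
    score = base
    for m in metrics:
        if m.observed > m.threshold:
            score -= m.penalty
    return score

account_mgmt = [Metric("domain_admins", threshold=3, observed=3, penalty=20)]
print(domain_score(account_mgmt))   # 100 -- green

account_mgmt[0].observed = 4        # a new domain admin appears
print(domain_score(account_mgmt))   # 80 -- red until approved or removed

account_mgmt[0].threshold = 4       # the admin accepts the change...
print(domain_score(account_mgmt))   # ...and the score recovers to 100
```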

He also mentioned that if the system were connected to a honeypot, it could monitor the level of concern based on shifts in the 'background radiation' of probe activity.
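
That 'background radiation' framing suggests comparing current honeypot probe activity against a rolling baseline. A toy sketch of the idea – the window size and tolerance are assumptions on my part, not anything Cyber Observer documents:

```python
from collections import deque

class BackgroundRadiationMonitor:
    """Flag when honeypot probe counts shift away from their recent baseline."""

    def __init__(self, window: int = 24, tolerance: float = 2.0):
        self.history = deque(maxlen=window)  # e.g., hourly probe counts
        self.tolerance = tolerance           # multiple of baseline that triggers

    def observe(self, probes_this_hour: int) -> bool:
        baseline = sum(self.history) / len(self.history) if self.history else None
        self.history.append(probes_this_hour)
        return baseline is not None and probes_this_hour > baseline * self.tolerance

monitor = BackgroundRadiationMonitor()
for count in [5, 7, 6, 5, 6, 40]:  # a sudden surge of probes in the last hour
    if monitor.observe(count):
        print(f"probe surge: {count}/hr vs. recent baseline")
```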

I don't know much about this market and who the competitors are, but the software looks like a powerful tool that can take latency out of the organizational response to this critical area. As machine learning techniques improve, the capabilities in this space should increase, recognizing anomalies more effectively over time. I was not able to dig into the IoT capabilities, which are a whole other level of information flow and concern.

The organization has a blog covering their efforts, but I would have expected more content, since there hasn't been a post this year.

Is AI a distraction?

I was recently in an exchange with a respected industry analyst who stated that AI is not living up to its hype – they called AI 'incremental' and a 'distraction'. This caught me a bit by surprise, since my view is that there are more capabilities and approaches available to AI practitioners than ever before. It may be the approach of business and tech decision makers that is at fault.

It got me thinking about the differences between 'small' AI efforts and enterprise AI efforts. Small AI efforts are those innovative, quick efforts that can prove a point and deliver value and understanding in the near term. Big AI (and automation) efforts are those associated with ERP and other enterprise systems that take years to implement. These are likely the kinds of efforts the analyst was involved with.

Many of the newer approaches make it possible to use the abundance of available capabilities to mine value out of the existing data that lies fallow in most organizations. These technologies can be tried out and applied in short, well-defined sprints with clear success criteria. If, along the way, the answers are not quite what was expected, adjustments can be made, assumptions changed, and value still generated. The key is going into these projects with expectations while remaining flexible enough to change based on what is learned rather than just supposition.

These approaches can be implemented across the range of business processes (e.g., budgeting, billing, support) as well as information sources (IoT, existing ERP or CRM). They can automate the mundane and free up high-value personnel to focus on generating even greater value and better service. Many times, these focused issues are unique to an organization or industry and provide immediate return. This is generally not the focus of enterprise IT solutions.

This may be the reason some senior IT leaders are disillusioned with the progress of AI in their enterprise. The smaller, high-value projects' contributions are rounding error at their scope. They are looking for the big hit, which by its very nature will be a compromise, unlikely to move the ball in any definitive way – everyone deploying the same enterprise solution will have access to the same tools…

My advice to those leaders disenchanted with the return from AI is to shift their focus. Get a small team out there experimenting with 'the possible'. Give them clear problems (and expectations) but allow them the flexibility to bring in new tools and approaches. Make them show progress, but be flexible enough to shift expectations when their results point in a different direction, based on facts and results. There is the possibility of fundamentally different levels of cost and value generation.

The keys are:

1. Think about the large problems, but act on those that can be validated and addressed quickly – invest in the small wins.

2. Have expectations that can be quantified, and focus on value – projects are not a 'science fair' or a strategic campaign, just a part of the business.

3. Be flexible and adjust as insight is developed – just because you want the answer to be 'yes' doesn't mean it will be, but any answer is valuable when compared to a guess.

Sure, this approach may be 'incremental' (to start), but it should make up for that with momentum and results. If the approach is grounded in expectations and value generation, and is done right, it should never be a 'distraction'.

What’s the real outcome of Salesforce’s AI predictions?

Yesterday, I was catching up on my technology email and came across this post stating that Salesforce now powers over 1B predictions every day for its customers. That's a pretty interesting number to throw out there, but it makes me ask "so what?" How are people using these predictions to make greater business impact?

The Salesforce website states:

“Einstein is a layer of artificial intelligence that delivers predictions and recommendations based on your unique business processes and customer data. Use those insights to automate responses and actions, making your employees more productive, and your customers even happier.”

Another 'nice' statement. Digging into the material a bit more, Einstein (the CRM AI function from Salesforce) appears to analyze previous deals and predict whether a specific opportunity is likely to be successful, helping to prioritize your efforts. It improves the presentation of information with some insight into what it means, and it appears to be integrated into the CRM system users are already familiar with.

For a tool that has been around since the fall of 2016, especially one based on analytics, I had difficulty finding any independent quantitative analysis of its impact. Salesforce did have a cheat sheet with some business impact analysis of the AI solution (and blog posts), but no real target market context to frame those metrics – who are they based on?

It may be that I just don't know where to look, but it does seem like a place for some deeper analysis and validation. The analysts could be waiting for other vendors' solutions to compare against.

In the micro view, organizations that dive into this pool should take a more quantitative approach: define their past performance and expectations, then validate actuals against predictions. That is the only way a business can justify the effort and improve. It is not sufficient to just put the capabilities out there and call it done.
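
As a sketch of what validating actuals against predictions could look like, assuming you can export each opportunity's predicted win probability along with its eventual outcome (the field names and figures here are hypothetical):

```python
# Compare predicted win probabilities against what actually happened.
# In practice these records would be exported from the CRM as deals close.
deals = [
    {"predicted_win_prob": 0.9, "won": True},
    {"predicted_win_prob": 0.8, "won": True},
    {"predicted_win_prob": 0.7, "won": False},
    {"predicted_win_prob": 0.3, "won": False},
    {"predicted_win_prob": 0.2, "won": True},
]

# Accuracy at a 0.5 cutoff: did the prediction call the outcome correctly?
correct = sum((d["predicted_win_prob"] >= 0.5) == d["won"] for d in deals)
print(f"accuracy: {correct / len(deals):.0%}")

# Calibration spot-check: of the deals scored 0.8+, how many actually closed?
high = [d for d in deals if d["predicted_win_prob"] >= 0.8]
print(f"high-confidence deals won: {sum(d['won'] for d in high)}/{len(high)}")
```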

It goes back to the old adage:

“trust, but verify”

Simplicity, the next big thing?

Recently, Dynatrace conducted a survey of CIOs on their top challenges. Of the top six, almost all deal with concerns about complexity. There is no doubt that numerous technologies are being injected into almost every industry from a range of vendors. Integrating this multivendor cacophony is rife with security risks and misunderstanding – whether in your network or your IoT vendor environment.

Humans have a limited capacity to handle complexity before they throw up their hands and just let whatever happens wash over them. That fact is one of the reasons AI is being viewed as the savior for the future. Back in 2008, I wrote a blog post for HP that mentioned:

“the advent of AI could allow us to push aside a lot of the tasks that we sometimes don’t have the patience for, tasks that are too rigorous or too arduous.”

IT organizations need to shift their focus back to making the business environment understandable, not just injecting more automation or data collection. Businesses need to take latency out of decision making and increase the level of understanding and confidence. A whole new kind of macro-level (enterprise) human interface design is required. Unfortunately, this market is likely a bit too nebulous to be targeted effectively today, other than through vague terms like analytics… But based on the survey results, large-scale understanding (and then demand) appears to be dawning on leadership.

The ROI of efforts to simplify and encourage action should be higher than that of just adding one more tool to the sprawling portfolio most organizations already maintain. We'll see where the money goes, though, since that ROI is likely to be difficult to prove when compared to the other shiny objects available.

Six thoughts on mobility trends for 2018

Let's face it, some aspects of mobility are getting long in the tooth, yet the demand for more capabilities is insatiable. Here are a few areas where I think 2018 will see some exciting capabilities develop. Many of these are not new, but their interactions and intersections should provide some interesting results and thoughts to include during your planning.

1. Further blurring and integration of IoT and mobile

We're likely to see more situations where mobile devices recognize the IoT devices around them to enhance contextual understanding for the user. We've seen some use of NFC and Bluetooth to share information, but approaches that embrace the environment and act upon the information available are still in their infancy. This year should provide some significant use cases and maturity.

2. Cloud Integration

By now most businesses have done much more than just stick their toe in the cloud Everything as a Service (XaaS) pool. As the number of potential devices in the mobility and IoT space expands, the flexibility and time to action that cloud solutions facilitate need to be understood and put into practice. It is also time to take all the data coming in from these devices and transform that flow into true contextual understanding and action, which also requires a dynamic computing environment.

3. Augmented reality

With augmented reality predicted to expand to a market somewhere between $120 and $221 billion in revenue by 2021, we're likely to see quite a bit of innovation in this space. The width of that range demonstrates the lack of real understanding. 2018 should be the year AR gets real.

4. Security

All discussions of mobility need to include security. Heck, the first month of 2018 should have nailed the importance of security into the mind of anyone in the IT space. There were more patches (and patches of patches) on a greater range of systems than many would have believed possible just a short time ago. Recently, every mobile store (Apple, Android…) was found to host nefarious software that had to be excised. Mobile developers need to be ever more vigilant, not just about the code they write but about the libraries they use.

5. Predictive Analytics

Context is king, and the use of analytics to increase understanding of the situation and possible responses is going to continue to expand. As capabilities advance, only our imagination will hold this area back from expanding where and when mobile devices are useful. Unfortunately, the same can be said of security attacks built on predictive analytics.

6. Changing business models

Peer-to-peer solutions continue to be all the rage, but with the capabilities listed above, whole new approaches to value generation are possible. There will always be early adopters willing to play with these, and with the deeper understanding possible today, new approaches to crossing the chasm will be demonstrated.

It should be an interesting year…

IT opportunities and cruising…

I recently went on a western Caribbean cruise with Royal Caribbean. This was the first cruise I'd been on in a couple of years, and I found it interesting how much mobile device use there was on the boat – everything from folks checking email… to live streaming at the breakfast table (at an additional cost, of course). There still seem to be numerous, more subtle ways to enhance the cruise experience now that nearly everyone carries a capable device.

There is an anecdote about cruising that for every couple that gets on a cruise, one of them doesn't really want to be there. That's probably a bit strong, but it is true that there are numerous activities going on at any one time, and finding the right one to interest you could be made easier.

I could easily see adding NFC or low-power Bluetooth spots throughout the ship that personal devices could tap into for service information or even historic facts and trivia. As I drive across the country, I see numerous historic marker signs along the highway – they exist because some people are interested in what happened at those locations in the past. Adding similar capabilities to the ship would be interesting: items of specific interest (music performers and performances, celebrity spottings, changes in ship design over the years) could be broadcast. It would open up gamification, scavenger hunt, and Pokémon Go-like possibilities that would interest some on board.
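
The plumbing for this could be quite lightweight: each NFC tag or Bluetooth spot only needs a stable identifier that the ship's app maps to content. A hypothetical sketch – the identifiers and content below are invented:

```python
# Hypothetical mapping from beacon/NFC identifiers placed around the ship
# to the trivia or service info shown when a phone comes within range.
POINTS_OF_INTEREST = {
    "deck5-atrium":  "The atrium chandelier was redesigned in the 2015 refit.",
    "deck4-theater": "Tonight's headliner performance: doors open at 7:30pm.",
    "deck11-dining": "'My time' dining is forward on this deck; see the app for wait times.",
}

def on_beacon_detected(beacon_id: str) -> None:
    """Called by the app whenever the OS reports a nearby tag or beacon."""
    info = POINTS_OF_INTEREST.get(beacon_id)
    if info:
        print(f"[{beacon_id}] {info}")

on_beacon_detected("deck5-atrium")
```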

Analytic data from IoT and business process systems could be shared to optimize the experience – for example, how long the wait is at 'my time' dining. A news feed capability could also be useful, letting you subscribe to information about where the ship is or when it will get into port. Naturally, there would be a great deal of opportunity to upsell passengers on jewelry, drinks, excursions… as well.

There may be some interesting peer-to-peer sharing experiences as well. The one I've thought about for a long time: allowing folks to share their skills and interests so they can be notified when someone within 50 feet is an author or expert on a topic of interest. It could also enable ad-hoc meetings – on our cruise there were quilting, veteran, and Victorian dance groups, each holding a public meeting at a specific time and place. These capabilities would encourage interactions with other passengers that they wouldn't normally experience. They would have to be opt-in, though, to let those who want to get away have that experience as well.
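
A rough sketch of how that opt-in matching might work – the positions, profiles, and 50-foot radius handling here are illustrative assumptions:

```python
import math

# Opt-in profiles: approximate (x, y) position in feet plus declared interests.
passengers = {
    "Alice": {"pos": (10, 20),  "interests": {"quilting", "jazz"}, "opt_in": True},
    "Bob":   {"pos": (40, 50),  "interests": {"quilting"},         "opt_in": True},
    "Carol": {"pos": (400, 90), "interests": {"quilting"},         "opt_in": False},
}

RADIUS_FT = 50

def nearby_matches(me: str):
    """Yield opted-in passengers within RADIUS_FT who share an interest."""
    mine = passengers[me]
    for name, profile in passengers.items():
        if name == me or not profile["opt_in"]:
            continue  # never surface anyone who hasn't opted in
        shared = mine["interests"] & profile["interests"]
        if shared and math.dist(mine["pos"], profile["pos"]) <= RADIUS_FT:
            yield name, shared

for name, shared in nearby_matches("Alice"):
    print(f"{name} nearby shares your interest in {', '.join(sorted(shared))}")
```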

The use of augmented reality also seems like a missed opportunity: an app could take some of the signpost information mentioned earlier and enhance it with directional information. This could help lead you to the experience you'd like to have, rather than the one you settle for based on what you already know.

What I am getting at is this: different people want a range of experiences on a cruise, and it seems like numerous opportunities are being missed by both passengers and the cruise lines to make the most of the occasion with relatively little additional effort. There are some significant privacy and customer satisfaction concerns, but I am sure a range of pilots would quickly surface the issues and possibilities.

Abundance and the value potential of IT — things have changed…

Since I have moved to a new blog site, I decided to update a post on my foundational beliefs about IT, the future, and what they should mean to business.

A number of years back, I posted that the real value for business lies in understanding what is unique, separating what is abundant from what is scarce, and planning to take business advantage of that knowledge.

I came up with this model to look at how things have changed:

Today, there is an abundance of data coming in from numerous sources. A range of connection options can move that data around to an abundance of computing alternatives. Even the applications available to run against the data continue to grow almost beyond understanding. Various service providers and options exist to quickly pull these together into custom(-ish) solutions.

Yet there are elements of the business that remain scarce, or at least severely limited by comparison. The attention span of personnel, the security and privacy of our environment, and even actions based on contextual understanding of what's happening persist in being scarce. Part of every organization's strategic planning (and enterprise architecture effort) needs to address how to use the abundance to maximize the value of the scarce elements and resources – since each business may have its own set of abundant and scarce components.

For IT organizations, one thing to keep in mind is that almost every system in production today was built from a scarcity model of never having enough compute, data… Those perspectives must be reassessed, and the business value that may be generated reevaluated, since that once-solid foundation is no longer stable. The business that understands this shift and adjusts is going to have a significant advantage and greater flexibility.