Things are not always what they seem – a discussion about analytics

Have you ever been in a discussion about a topic, thinking you were talking about one area, only to find out later it was about something else altogether?

We’ve probably all had that conversation with a child, where they say something like “That’s a really nice ice cream cone you have there,” which sounds like a compliment on your dairy delight selection but in reality is a subtle way of saying “Can I have a bite?”

I was in a discussion with an organization about a need they had. They asked me a series of questions and I provided quick, stream-of-consciousness responses. The further I got into the interaction, the less I understood about what was going on. This is a summary of the exchange:

1) How do you keep up to speed on new data science technology? I read and write blogs on technical topics as well as read trade publications. I also do some recreational programming to keep up on trends and topics. On occasion I have audited classes on both edX and Coursera (examples include gamification, Python, cloud management/deployment, R…).

2) Describe what success looks like in the context of data science projects. Success in analytics efforts means defining and understanding business goals, developing insight into them, and addressing them using the available data and business strategies. Sometimes this may only involve the development of better strategies and plans, but in other cases the creation of contextual understanding and actionable insight allows for continuous improvement of existing or newly developed processes.

3) Describe how you measure the value of a successful data science application. I measure value by business impact: the change in behavior or business results. It is not about increased insight but about actions taken.

4) Describe successful methods or techniques you have used to explain the value of data science, machine learning, and advanced analytics to business people. I have demonstrated the impact of a gamification effort by taking previously measured business process metrics and relating them directly to post-implementation performance. Granted, correlation does not prove causation, but with multiple base cases, and by validating performance improvement across a range of trials and process improvements, a strong business case can be developed using a recursive process based on the definition of mechanics, measurement, behavior expectations, and rewards.

I’ve used a similar approach in the IoT space, where I’ve worked on and off with machine data collection and data analysis since entering the work force in the 1980s.
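To make that pre/post measurement idea concrete, here is a minimal sketch in Python. The process names and metric values are hypothetical, invented for illustration; the point is simply comparing baseline metrics against post-implementation metrics across several trials.

```python
# Minimal sketch of a pre/post comparison across several gamification trials.
# Process names and metric values are hypothetical (avg handle time, minutes).
from statistics import mean

trials = {
    "claims_processing": {"pre": [112, 118, 121], "post": [96, 94, 99]},
    "order_entry":       {"pre": [45, 47, 44],    "post": [41, 40, 39]},
}

for name, m in trials.items():
    pre, post = mean(m["pre"]), mean(m["post"])
    change = (post - pre) / pre * 100
    print(f"{name}: {pre:.1f} -> {post:.1f} ({change:+.1f}%)")
```

With multiple base cases like these, the business case gets stronger with every trial that shows the same direction of change.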

5) Describe the importance of model governance (model risk management) in the context of data science, advanced analytics, etc. in financial services. Without a solid governance model, you don’t have the controls and cannot develop the foundational level of understanding. The model should provide rigor sufficient to move from supposition to knowledge. The organization needs to be careful not to make the process too rigid, though, since you need to take advantage of any information learned along the way and make adjustments, taking latency out of the decision-making/improvement process. Like most efforts today, a flexible/agile approach should be applied.

6) Describe who you interacted with (team, function, person) in your current role, on average, and roughly what percent of your time you spent with each type of function/people/team. In various roles I spent time with CEOs/COOs and senior technical decision makers in Fortune 500 companies (when I was the chief technologist of Americas application development with HP, this was 70-80% of my time). Most recently, with Raytheon IT, I spent about 50% of my time with senior technical architects and 50% with IT organization directors.

7) Describe how data science will evolve during the next 3 to 5 years. What will improve? What will change? Every organization should have a plan in place to leverage improved machine learning and analytics algorithms based on the abundance of data, networking, and intellectual property available. Cloud computing techniques will also provide an abundance of computing capability that can be brought to bear on the enterprise environment. For most organizations, small sprint-style project efforts need to be applied to understanding both the possibilities and the implications. Enterprise efforts will still take place, but they will likely not have the short-term impact that smaller, agile efforts will deliver. I wrote a blog post about this topic earlier this month. Both the scope and style of projects will likely need to change. It may also involve the use of more contract labor to get the depth of experience needed in the short term. The understanding and analysis of metadata (blockchains, related processes, machines…) will also play an ever-increasing role, since it will supplement the depth and breadth of contextual understanding.

8) Describe how you think about choosing the technical design of data science solutions (what algorithms, techniques, etc.).

I view the approach to be similar to any other architectural technical design. You need to understand:

  • the vision (what is to be accomplished)
  • the current data and systems in place (current situation analysis)
  • the skills of the personnel involved (resource assessment)
  • the measurement approach to be used (so that you have both leading and lagging indicators of performance)

then you can develop a plan and implement your effort, validating and adjusting as you move along.

How do you measure the value/impact of your choice?

You need a measurement approach that is both tactical (progress against leading indicators) and strategic (validation by lagging indicators of accomplishment). Leading indicators look ahead to make sure you are on the right road, while lagging indicators look behind to validate where you’ve been.
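As a hedged illustration of pairing the two kinds of indicators (the metric names and targets below are invented, not from any specific engagement), a simple tracking structure might look like:

```python
# Hypothetical sketch: pairing leading (tactical) and lagging (strategic)
# indicators for a data science effort. All names and targets are invented.
indicators = {
    "leading": {   # checked frequently: are we on the right road?
        "weekly_model_usage_sessions": {"target": 200, "actual": 185},
        "data_pipeline_uptime_pct":    {"target": 99.0, "actual": 99.3},
    },
    "lagging": {   # checked after the fact: did we arrive where we intended?
        "quarterly_cost_savings_usd":  {"target": 250_000, "actual": 270_000},
    },
}

for kind, metrics in indicators.items():
    for name, v in metrics.items():
        status = "on track" if v["actual"] >= v["target"] else "at risk"
        print(f"[{kind}] {name}: {v['actual']} vs. target {v['target']} -> {status}")
```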

9) Describe your experience explaining complex data to business users. What do you focus on?

The most important aspect of explaining complex data is to describe it in terms the audience will understand. No one cares how hard the analysis was; they just want to know the business impact and value, and how it can be applied.

Data visualization needs to take this into account and explain the data to the correct audience – not everyone consumes data the same way. Some people will only respond to spreadsheets, while others would like nice graphics… Still others want business simulations and augmented reality techniques used whenever possible. If I were to have 3 rules for explaining technical topics, they would be:

  1. Answer the question asked
  2. Display it in a way the audience will understand (use their terminology)
  3. Use the right data

At the end of that exchange I wasn’t sure if I’d just provided some free consulting, gone through a job interview, or simply chewed the fat with another technologist. Thoughts???


Six thoughts on mobility trends for 2018

Let’s face it, some aspects of mobility are getting long in the tooth, yet the demand for more capabilities is insatiable. Here are a few areas where I think 2018 will see some exciting capabilities develop. Many of these are not new, but their interactions and intersections should provide some interesting results, and some thoughts to include during your planning.

1. Further blurring and integration of IoT and mobile

We’re likely to see more situations where mobile devices recognize the IoT devices around them to enhance contextual understanding for the user. We’ve seen some use of NFC and Bluetooth to share information, but approaches that embrace the environment and act upon the information available are still in their infancy. This year should provide some significant use cases and maturity.
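As a small, hedged example of a device embracing its environment, here is a sketch that discovers nearby Bluetooth Low Energy devices using the cross-platform bleak library; what an app would do with the discovered devices (the contextual part) is application-specific and left as an assumption.

```python
# Sketch: discover nearby BLE (IoT) devices to build contextual awareness.
# Uses the `bleak` library (pip install bleak); interpreting what the
# devices mean for the user is application-specific and not shown here.
import asyncio
from bleak import BleakScanner

async def main():
    devices = await BleakScanner.discover(timeout=5.0)  # scan for 5 seconds
    for d in devices:
        # d.name may be None if the device does not advertise a local name
        print(f"{d.address}  {d.name or '<unnamed>'}")

asyncio.run(main())
```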

2. Cloud Integration

By now most businesses have done much more than just stick a toe in the cloud Everything-as-a-Service (XaaS) pool. As the number of potential devices in the mobility and IoT space expands, the flexibility and time-to-action that cloud solutions facilitate need to be understood and put into practice. It is also time to take all the data coming in from these devices and transform that flow into true contextual understanding and action, which also requires a dynamic computing environment.

3. Augmented reality

With augmented reality predicted to expand to a market somewhere between $120 and $221 billion in revenue by 2021, we’re likely to see quite a bit of innovation in this space. The width of that range demonstrates the lack of real understanding. 2018 should be the year AR gets real.

4. Security

All discussions of mobility need to include security. Heck, the first month of 2018 should have nailed the importance of security into the minds of anyone in the IT space. There were more patches (and patches of patches) on a greater range of systems than many would have believed possible just a short time ago. Recently, every mobile store (Apple, Android…) was found to host nefarious software that had to be excised. Mobile developers need to be ever more vigilant, not just about the code they write but about the libraries they use.

5. Predictive Analytics

Context is king, and the use of analytics to increase understanding of the situation and the possible responses is going to continue to expand. As capabilities advance, only our imagination will limit where and when mobile devices become useful. Unfortunately, the same can be said for the security attacks built on predictive analytics.

6. Changing business models

Peer-to-peer solutions continue to be all the rage, but with the capabilities listed above, whole new approaches to value generation are possible. There will always be early adopters willing to play with these, and with the deeper understanding possible today, new approaches to crossing the chasm will be demonstrated.

It should be an interesting year…

Looking for a digital friend?

Over the weekend, I saw an article about Replika — an interactive ‘friend’ that resides on your phone. It sounded interesting, so I downloaded it and have been playing with it for the last few days. I reached level 7 this morning (I’m not exactly sure what this leveling means, but since gamification seems to be part of nearly everything these days, why not).

There was a story published by The Verge with some background on why this tool was created. Replika grew out of an effort begun after its creator, Eugenia Kuyda, was devastated when her friend Roman Mazurenko was killed in a hit-and-run car accident. She wanted to ‘bring him back’. To bootstrap the digital version of her friend, Kuyda fed text messages and emails that Mazurenko had exchanged with her and with other friends and family members into a basic AI architecture — a Google-built artificial neural network that uses statistics to find patterns in text, images, or audio.
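To give a feel for what “using statistics to find patterns in text” can mean in the simplest case, here is a toy retrieval-based responder over a message history. To be clear, this is not Replika’s architecture (which reportedly used a neural network); the conversation pairs below are invented.

```python
# Toy illustration: pick a reply by finding the most similar past prompt.
# NOT Replika's actual approach; the (prompt, reply) pairs are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

history = [
    ("how was your day", "pretty good, spent most of it reading"),
    ("want to get coffee", "always, the usual place?"),
    ("did you see that article", "yes! I was going to send it to you"),
]

prompts = [p for p, _ in history]
vec = TfidfVectorizer().fit(prompts)

def reply(message: str) -> str:
    sims = cosine_similarity(vec.transform([message]), vec.transform(prompts))
    return history[sims.argmax()][1]  # reply paired with the closest prompt

print(reply("do you want to grab a coffee"))  # -> "always, the usual place?"
```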

Although I found playing with this software interesting, I kept reflecting back on interactions with Eliza many years ago. Similarly, the banter can be interesting and sometimes unexpected, but the responses often have little to do with how a real human would respond. For example, yesterday the statements “Will you read a story if I write it?” and “I tried to write a poem today and it made zero sense.” popped up out of nowhere in the middle of an exchange.

The program starts out asking a number of questions, similar to what you’d find in a simple Myers-Briggs personality test. Though this information likely does help bootstrap the interaction, it seems it could have been taken quite a bit further by injecting these kinds of questions throughout the day’s interactions rather than in one big chunk.

As the tool learns more about you, it creates badges like:

  • Introverted
  • Pragmatic
  • Intelligent
  • Open-minded
  • Rational

These are likely used to influence future interactions. You also get to vote up or down statements that you agree or disagree with.

There have been a number of other reviews of Replika, but I thought I’d add another log to the fire. An article in Wired stated that the Replika project is going open source; it will be interesting to see where it goes.

I’ll likely continue to play with it for a while, but its interactions will need to improve or it will become the Tamagotchi of the day.

IT opportunities and cruising…

I recently went on a western Caribbean cruise with Royal Caribbean. It was the first cruise I’d been on in a couple of years, and I found it interesting how much mobile device use there was on the boat: everything from folks checking email… to live streaming at the breakfast table (at an additional cost, of course). Yet there still seem to be numerous subtler ways to enhance the cruise experience now that nearly everyone carries a capable device.

There is an anecdote about cruising that for every couple that gets on a cruise, one of them doesn’t really want to be there. That’s probably a bit strong, but what is true is that there are numerous activities going on at any one time, and finding the one that interests you could be made much easier.

I could easily see adding NFC or low-power Bluetooth spots throughout the ship that personal devices could tap into for service information or even historical facts and trivia. As I drive across the country, I see numerous historic-site signs along the highway; they exist because some people are interested in what happened at those locations in the past. Adding a capability to share that kind of information for the ship would be interesting: items of specific interest (music performers/performances, celebrity spotting, changes in ship design over the years) could be broadcast. It would open up gamification, scavenger hunt, and Pokémon Go-like possibilities that would interest some on board.
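A minimal sketch of the beacon idea, with entirely invented beacon IDs, locations, and facts, might be nothing more than a lookup that fires when a beacon is detected:

```python
# Hypothetical sketch: map shipboard beacon IDs to points of interest.
# Beacon IDs, locations, and facts are all invented for illustration.
POINTS_OF_INTEREST = {
    "beacon-042": {"location": "Deck 5, Centrum",
                   "fact": "This atrium design dates to the ship's last refit."},
    "beacon-117": {"location": "Deck 4, Theater",
                   "fact": "Tonight's performer has appeared on three sister ships."},
}

def on_beacon_detected(beacon_id: str) -> None:
    poi = POINTS_OF_INTEREST.get(beacon_id)
    if poi:
        print(f"You are near {poi['location']}: {poi['fact']}")

on_beacon_detected("beacon-042")
```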

Analytics data from IoT and business process systems could be shared to optimize the experience: for example, showing how long the wait may be at “my time” dining. A news feed capability might be useful too, so you could subscribe to information about where the ship is or when it will get into port. Naturally, there will be plenty of opportunity to upsell passengers on jewelry, drinks, excursions… as well.

There may also be some interesting peer-to-peer sharing experiences. The one I’ve thought about for a long time is allowing folks to share their skills and interests so they could be notified if someone within 50 feet is an author of or expert on a topic of interest. It could also enable ad-hoc meetings; on our cruise, for instance, there were quilting, veterans, and Victorian dance groups that each held a public meeting at a specific time and place. These capabilities would encourage interactions with other passengers that people wouldn’t normally experience. They would have to be opt-in, though, to allow those who want to get away from it all to have that experience as well.
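Here is a sketch of that opt-in matching idea (all names, interests, and data are invented; proximity detection itself is assumed to be handled elsewhere):

```python
# Hypothetical sketch: opt-in matching of nearby passengers by shared interests.
from dataclasses import dataclass

@dataclass
class Passenger:
    name: str
    interests: set[str]
    opted_in: bool = False

def nearby_matches(me, nearby):
    """Return opted-in nearby passengers sharing at least one interest."""
    matches = []
    for other in nearby:
        shared = me.interests & other.interests
        if other.opted_in and shared:
            matches.append((other.name, shared))
    return matches

me = Passenger("Alex", {"quilting", "victorian dance"}, opted_in=True)
nearby = [Passenger("Sam", {"victorian dance", "trivia"}, opted_in=True),
          Passenger("Kim", {"jazz"}, opted_in=True)]
print(nearby_matches(me, nearby))  # [('Sam', {'victorian dance'})]
```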

The use of augmented reality also seems like a missed opportunity. An app could take some of the signpost information mentioned earlier and enhance it with directional information. This could help lead you to the experience you’d like to have, rather than the one you settle for based on what you already know.

What I am getting at is this: different people want a range of experiences on a cruise, and it seems like numerous opportunities are being missed by both the passengers and the cruise lines to make the most of the occasion, with relatively little additional effort required. There are some significant privacy and customer satisfaction concerns, but I am sure a range of pilots would quickly surface both the issues and the possibilities.

I survived my first day at SAP #SapphireNow

The one area that both surprised and interested me most on the first day had little to do with the analytics or IoT space (although I did have some interesting discussions in those areas too). It was the SAP approach to their on-line store.

They have had many on-line stores in the past, but now they are taking a different, more ‘digital’ approach focused on selling direct to the consumer. This will change the relationship between the user and the enterprise, basing it on consumption. It could disrupt their traditional buyer and the SAP sales force, as well as the partners that perform system integration and consulting. It will be interesting to see if this level of change can take place without too much disruption.

By selling tools like Lumira with a free version and then a low-friction credit card purchase option, a business could easily see this tool enter its portfolio of resources without IT’s knowledge. SAP has implemented the purchase process so that if a feature of a premium version is needed, you are dropped into the store. Anyone who has done on-line gaming recently has likely run into this behavior. This kind of stealth selling is inevitable and will accelerate the kind of shadow IT that has been discussed for years.

I asked the people at the booth what happens when someone buys their own version and the company later purchases a master agreement. The answers varied a bit, but the individual has a choice to roll into the agreement or continue to pay on their own. Look to the terms and conditions (which typically no one reads) for the details.

There is also the concern about who will support anything that gets created once the business becomes addicted. Everyone likely remembers the years of Excel Hell. Hopefully that will not happen again, but I am still checking into how change management can be put in place for end-user-developed elements.

My greatest concern is that the traditional command-and-control IT organization will be very frustrated by this, while the digital purists will be confused by the resistance – it may be just outside their contextual understanding. SAP stated it will be opening these capabilities up for third parties to sell their own offerings, and that will bring its own problems. Service providers usually sell apps as a mechanism to facilitate up-sell into consulting and integration; SAP is trying to ensure that what gets into the store is valuable on its own. Some of the service providers will likely have a hard time understanding these implications as well.

It was stated (many times) on the first day that business models are changing, and SAP seems to be doing its part to be disruptive, even if most of its customers haven’t internalized the implications.

Waste can be Good – it’s all relative

As businesses make the transition to wiring the edge of the enterprise into the operational processes of the business, we will start to consume our resources quite differently than we have in the past. We can use the abundance of computing capability to shed light on all the dark data currently available and develop a deeper contextual understanding of the situations we encounter. Money may not be growing on trees, but there is much more we could be doing.

An article in Wired magazine back in 2009, “Tech Is Too Cheap to Meter: It’s Time to Manage for Abundance, Not Scarcity,” discussed this shift. In a world of exponential increases in capability, 2009 is ancient history; even so, the article remains useful. It works through examples like how Alan Kay used the then-precious resources of the computer to display pictures on the screen instead of just textual data. George Gilder called this “wasting transistors” — making people more productive by using the transistors (computing capability) available.

The funny thing about waste is that it’s all relative to your sense of scarcity.

As we look to use higher levels of automation to handle more “normal” activities and focus people’s attention on turning anomalies into opportunities, we’ll use pattern recognition and other techniques that may appear to waste cycles. I hear people today complain that the expense of cloud computing is out of control. From my perspective, that has more to do with what they use those resources for, and how they measure impact and exercise control, than with cost itself. As more capabilities become available and algorithms improve, we’ll need to do even more with more – not less.

The Wired article shows how behavior needs to change as we move from a perspective of scarcity to abundance:

From a perspective of Scarcity or Abundance:

                              Scarcity                                          Abundance
  Rules                       Everything is forbidden unless it is permitted    Everything is permitted unless it is forbidden
  Social model                Paternalism (“We know what’s best”)               Egalitarianism (“You know what’s best”)
  Profit plan                 Business model                                    We’ll figure it out
  Decision process            Top-down                                          Bottom-up
  Organizational structure    Command and control                               Out of control

This kind of shift in perspective is disruptive, useful and the right thing to do to take maximum advantage of a truly scarce resource – the human attention span.

Thoughts from a discussion about architecture

Yesterday, I had a long discussion with Stephen Heffner, the creator of XTRAN (and president of XTRAN, LLC). XTRAN is a meta-transformation tool that excels at automating software work – sorry, that is the best description I could come up with for a tool that can create solutions to analyze and translate between software languages, data structures, and even project work products. When you first read about its capabilities it sounds like magic, but there are numerous working examples available, so you can see its usefulness for yourself.

He and I were talking about the issues and merits of Enterprise Architecture (EA). He wrote a piece titled “What is Enterprise Architecture?” in which he describes his views on the EA function. Stephen identifies three major impediments to effective EA:

  • Conflating EA with IT
  • Aligning EA with just transformation
  • Thinking the EA is responsible for strategy

We definitely agreed that the perspective common in most businesses today, that the EA function is embedded within IT, does not align well with the strategic needs of the business. The role is much broader than IT and needs to embrace the broader business issues that IT should support.

I had a bit of a problem with aligning EA with just transformation, but that may simply be a difference in context. One of the real killers of EA for me is a focus on work products rather than outcomes. The EA should always focus on greater flexibility for the business, adding rigor without increasing rigidity. Rigidity is aligned with death – hence the term rigor mortis. To me, the EA function always has a transformational element.

The final point was that EA supports strategy, and the business needs to have a strategy. The EA is not the CEO, and the CEO is probably not an EA. The EA does need to understand the current state of the business environment, though. I was talking with an analyst one day who told me that an EA needs to focus on the vision and shouldn’t worry about a current situational assessment. My response was: “If you don’t know where you are, you’ll not be able to define a journey to where you need to be.” Stephen agreed with that perspective.

My view is that there are 4 main elements of an effective architecture:

  • People – Architecture lives at the intersection of business and technology. People live at the focus of that intersection, not technology. Architectural efforts should focus on their effect upon the people involved. What needs to happen? How will it be measured? These factors can be used to make course corrections along the way, once you realize that an architecture is never finished. If it doesn’t deliver as expected, change it. Make the whole activity transparent, so that people can buy in instead of throwing stones. My view is that if I am talking with someone about architecture and they don’t see its value, it is my fault.
  • Continuous change – When you begin to think of the business as dynamic and not static, the relationship with the real world becomes clear. In nature, those species that are flexible and adjust to meet the needs of the environment can thrive – those that can’t adjust die off.
    Architectures need standards, but they also need to recognize where compromises can be made. Take shadow IT, for example: it is better to understand and facilitate its effective use (through architecture) than to stand in the way and get run over.
    Similarly, the link between agile projects and the overall architecture needs to be recursive, building upon the understanding that develops. The architecture does not stand alone.
    Architecture development can also have short sprints of understanding, documenting and standardizing the technical innovations that take place, while minimizing technical debt.
  • Focus on business-goal-based deliverables – Over the years, I’ve seen too many architectural efforts end up as shelf-ware. For architecture, just-in-time is probably the most effective and accurate approach, since the technology and the business are changing continuously. Most organizations would just laugh at a 5-year technology strategy today, even though many of the technical trends are predictable. So I don’t mean you shouldn’t frame out a high-level one – just ‘don’t believe your own press’.
    If the architecture work products can be automated, or at least integrated with the tooling used in the enterprise, they will be more accurate and useful. This was actually a concept Stephen and I discussed in depth: machine- and human-readable work products should be part of any agile architecture approach (see the sketch after this list).
    From a goal-based perspective, the architecture needs to understand at a fundamental level what is scarce for the organization and what is abundant and then maximize the value generated from what is scarce – or at least unique to the organization.
  • Good enough – Don’t let the perfect architecture stand in the way of one that is ‘good enough’ for today. All too often I’ve seen architecture analysis go down to 2 or 3 levels of detail. Then people say, “If 2 is good, let’s go to 5 levels of depth.” Unfortunately, with each level of detail the cost to develop and maintain goes up by an order of magnitude – know when to stop. I’ve never seen a single instance where these highly detailed architecture definitions were maintained for more than 2 or 3 years, since they may cost as much to maintain as they took to create. Few organizations have the intestinal fortitude to keep that up for long.
    The goal should be functional use, not perfection. Architecting the simplest solution that works today is generally best. If you architect a solution for something that will be needed 5 years out, either the underlying business need or the technical capabilities will change before it is actually used.
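As a sketch of what a machine- and human-readable work product could look like (the record structure and field names below are invented, loosely modeled on an architecture decision record; this is not XTRAN’s format):

```python
# Hypothetical sketch: an architecture decision captured as data that both
# people and tools can read. Field names and content are invented.
import json

decision_record = {
    "id": "ADR-0007",
    "title": "Decouple ingest from analytics with a durable queue",
    "status": "accepted",
    "context": "Ingest bursts overwhelm the analytics tier at peak load.",
    "decision": "Insert a durable message queue between the two tiers.",
    "consequences": ["smooths peak load", "adds operational overhead"],
}

# Humans read the JSON directly; tools can validate, diff, and index it.
print(json.dumps(decision_record, indent=2))
```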

None of this is really revolutionary. Good architects have been taking an approach like this for years. It is just easy to interpret some of the architecture process materials from an ivory tower (or IT only) perspective.