I survived my first day at SAP #SapphireNow

The one area that both surprised and interested me most on the first day had little to do with the analytics or IoT space (although I did have some interesting discussions in those areas too). It was SAP's approach to its online store.

SAP has had many online stores in the past, but it is now taking a different, more ‘digital’ approach focused on selling directly to the consumer. This shifts the relationship between the user and the enterprise to one based on consumption. It could disrupt their traditional buyer, the SAP sales force, and the partners that perform system integration and consulting. It will be interesting to see whether this level of change can take place without too much disruption.

By offering tools like Lumira with a free version and then a low-friction purchase option with a credit card, a business could easily see this tool enter its portfolio of resources without its knowledge. The purchase process is implemented so that if a feature of a premium version is needed, you are dropped into the store. Anyone who has done online gaming recently has likely run into this behavior. This kind of stealth selling is inevitable and will accelerate the kind of shadow IT that has been discussed for years.

I asked the people at the booth what happens when someone buys a copy on their own and the company later purchases a master agreement. The answers varied a bit, but the individual has a choice to roll into the agreement or continue to pay on their own. Look to the terms and conditions (which typically no one reads) for the details.

There is also the concern about who will support anything that gets created once the business becomes addicted. Everyone likely remembers the years of Excel Hell. Hopefully that will not happen, but I am still checking into how change management can be put in place for end-user-developed elements.

My greatest concern is that the traditional command-and-control IT organization will be very frustrated by this, while the digital purists will be confused by the resistance – it may be just outside their contextual understanding. SAP stated it will be opening these capabilities up for third parties to sell their own offerings, and that will have its own problems. Service providers usually sell apps as a mechanism to facilitate up-sell into consulting and integration. SAP is trying to ensure that what gets into the store is valuable on its own. Some of the service providers will likely have a hard time understanding these implications as well.

It was stated (many times) in the first day that business models are changing and SAP seems to be doing its part to be disruptive, even if most of its customers haven’t internalized the implications.

Waste can be Good – it’s all relative

As businesses make the transition to where the edge of the enterprise is wired into the operational processes of the business, we will start to consume our resources quite differently than we have in the past. We can use the abundance of computing capabilities to shed light on all the dark data currently available and develop a deeper contextual understanding of the situations we encounter. Money may not be growing on trees, but there is much more we can be doing.

An article in Wired magazine back in 2009 discussed how: Tech Is Too Cheap to Meter: It’s Time to Manage for Abundance, Not Scarcity. In this world of exponential increases in capability, 2009 is ancient history; even so, the article is useful. It works through examples like how Alan Kay used the precious resources of the computer to display pictures on the screen instead of just textual data. George Gilder called this “wasting transistors” – making people more productive by using the transistors (computing capability) available.

The funny thing about waste is that it’s all relative to your sense of scarcity.

As we look to use higher levels of automation to handle more “normal” activities and focus people’s attention on turning anomalies into opportunities, we’ll use pattern recognition and other techniques that may appear to waste cycles. I hear people today complain that the expense of cloud computing is out of control. From my perspective, that has more to do with what they use these resources for, and how they measure impact and exercise control, than with cost itself. As more capabilities become available and algorithms improve, we’ll need to do even more with more – not less.

The Wired article shows how behavior needs to change as we move from a perspective of scarcity to abundance:

From a perspective of Scarcity or Abundance:

  • Rules – Scarcity: everything is forbidden unless it is permitted; Abundance: everything is permitted unless it is forbidden
  • Social model – Scarcity: paternalism (“We know what’s best”); Abundance: egalitarianism (“You know what’s best”)
  • Profit plan – Scarcity: a business model; Abundance: “We’ll figure it out”
  • Decision process – Scarcity: top-down; Abundance: bottom-up
  • Organizational structure – Scarcity: command and control; Abundance: out of control

This kind of shift in perspective is disruptive, useful and the right thing to do to take maximum advantage of a truly scarce resource – the human attention span.

Thoughts from a discussion about architecture

Yesterday, I had a long discussion with Stephen Heffner, the creator of XTRAN (and president of XTRAN, LLC). XTRAN is a meta-transformation tool that excels at automating software work – sorry, that is the best description I could come up with for a tool that can create solutions to analyze and translate between software languages, data structures and even project work products. When you first read about its capabilities it sounds like magic. There are numerous working examples available, so you can see its usefulness for yourself.

He and I were talking about the issues and merits of Enterprise Architecture. He wrote a piece titled What is Enterprise Architecture?, where he describes his views on the EA function. Stephen identifies three major impediments to effective EA:

  • Conflating EA with IT
  • Aligning EA with just transformation
  • Thinking the EA is responsible for strategy

We definitely agreed that the prevailing view in most businesses, that the EA function is embedded within IT, does not align well with the strategic needs of the business. The role is much broader than IT and needs to embrace the broader business issues that IT should support.

I had a bit of a problem with aligning EA with just transformation, but that may simply be a difference in context. One of the real killers of EA for me is a focus on work products rather than outcomes. The EA should always have a focus on greater flexibility for the business, adding rigor without increasing rigidity. Rigidity is aligned with death – hence the term rigor mortis. To me, the EA function always has a transformational element.

The final point was that the EA supports strategy, and the business needs to have a strategy. The EA is not the CEO, and the CEO is probably not an EA. The EA does need to understand the current state of the business environment, though. I was talking with an analyst one day who told me that an EA needs to focus on the vision and shouldn’t worry about a current situational assessment. My response was: “If you don’t know where you are, you’ll not be able to define a journey to where you need to be.” Stephen agreed with that perspective.

My view is that there are 4 main elements of an effective architecture:

  • People – Architecture lives at the intersection of business and technology. People live at the focus of that intersection, not technology. Architectural efforts should focus on the effect on the people involved. What needs to happen? How will it be measured? These factors can be used to make course corrections along the way, once you realize that an architecture is never finished. If it doesn’t deliver as expected, change it. Make the whole activity transparent, so that people can buy in instead of throwing stones. My view is that if I am talking with someone about architecture and they don’t see its value, it is my fault.
  • Continuous change – When you begin to think of the business as dynamic and not static, the relationship with the real world becomes clear. In nature, those species that are flexible and adjust to meet the needs of the environment can thrive – those that can’t adjust die off.
    An architecture needs to have standards, but it also needs to recognize where compromises can be made. Shadow IT is a good example: it is better to understand and facilitate its effective use (through architecture) than to try to stand in the way and get run over.
    In a similar way, the link between agile projects and the overall architecture needs to be recursive, building upon the understanding that develops. The architecture does not stand alone.
    Architecture development can also have short sprints of understanding, documenting and standardizing the technical innovations that take place, while minimizing technical debt.
  • Focus on business-goal-based deliverables – Over the years, I’ve seen too many architectural efforts end up as shelf-ware. In the case of architecture, just-in-time is probably the most effective and accurate approach, since the technology and the business are changing continuously. Most organizations would just laugh at a 5-year technology strategy today; still, many of the technical trends are predictable, so I don’t mean you shouldn’t frame out a high-level one – just ‘don’t believe your own press’.
    If the architecture work products can be automated, or at least integrated with the tooling used in the enterprise, they will be more accurate and useful. This was actually a concept that Stephen and I discussed in depth. The concept of machine- and human-readable work products should be part of any agile architecture approach (see the sketch after this list).
    From a goal-based perspective, the architecture needs to understand at a fundamental level what is scarce for the organization and what is abundant and then maximize the value generated from what is scarce – or at least unique to the organization.
  • Good enough – Don’t let the perfect architecture stand in the way of one that is ‘good enough’ for today. All too often I’ve seen architecture analysis go down to 2 or 3 levels of detail. Then people say “if 2 is good, let’s go to 5 levels of depth.” Unfortunately, with each level of detail the cost to develop and maintain goes up by an order of magnitude – know when to stop. I’ve never seen a single instance where these highly detailed architecture definitions were maintained for more than 2 or 3 years, since they may actually cost as much to maintain as they took to create. Few organizations have the intestinal fortitude to keep that up for long.
    The goal should be functional use, not a focus on perfection. Architecting the simplest solution that works today is generally best. If you architect a solution for something that will be needed 5 years out, either the underlying business need or the technical capabilities will change before it is actually used.
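
To make the machine- and human-readable work product idea a little more concrete, here is a minimal sketch in Python. The field names, systems and validation rules are hypothetical illustrations of my own, not an XTRAN or TOGAF format; the point is simply that the same artifact people read and review can also be validated and consumed by tooling.

```python
import json

# Hypothetical architecture work product: a small capability map that links a
# business goal to the systems and team that support it. Humans can review it;
# tooling can validate and consume it.
capability_map = {
    "capability": "customer-order-tracking",
    "business_goal": "reduce order-status support calls",
    "owning_team": "fulfillment",
    "supporting_systems": ["order-service", "notification-service"],
    "review_by": "2025-06-30",
}

REQUIRED_FIELDS = {"capability", "business_goal", "owning_team", "supporting_systems"}


def validate(work_product: dict) -> list[str]:
    """Return a list of problems so tooling can flag incomplete or stale entries."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - work_product.keys()]
    if not work_product.get("supporting_systems"):
        problems.append("no supporting systems listed")
    return problems


if __name__ == "__main__":
    issues = validate(capability_map)
    if issues:
        print("work product needs attention:", issues)
    else:
        # Emit JSON so pipelines or inventory tooling can consume the same
        # artifact that architects and the business review.
        print(json.dumps(capability_map, indent=2))
```

The design point is the dual audience: because the work product is structured data rather than a slide, keeping it current becomes a lightweight, automatable chore instead of a shelf-ware exercise.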

None of this is really revolutionary. Good architects have been taking an approach like this for years. It is just easy to interpret some of the architecture process materials from an ivory tower (or IT only) perspective.

Abundance and the value potential of IT — things have changed…

Since I have moved to a new blog site I decided to update a post on my foundational beliefs about IT, the future and what it should mean to business.

A number of years back, I posted that the real value for business comes from understanding what is unique, separating what is abundant from what is scarce, and planning to take business advantage of that knowledge.

I came up with this model to look at how things have changed:

Today, there is an abundance of data coming in from numerous sources. A range of connection options can move the data around to an abundance of computing alternatives. Even the set of applications available to run on the data continues to grow almost beyond understanding. Various service providers and options exist to quickly pull these together into custom(-ish) solutions.

Yet there are elements of the business that remain scarce, or at least severely limited by comparison. The attention span of personnel, the security and privacy of our environment, and even actions based on a contextual understanding of what’s happening persist in being scarce. Part of every organization’s strategic planning (and enterprise architecture effort) needs to address how to use the abundance to maximize the value from the scarce elements and resources – since each business may have its own set of abundant and scarce components.

For IT organizations, one thing to keep in mind is that almost every system in production today was built from a scarcity model of never having enough compute, data… Those perspectives must be reassessed, and the implications for the value that may be generated for the business reevaluated, since that once-solid foundation is no longer stable. The business that understands this shift and adjusts is going to have a significant advantage and greater flexibility.

Enterprise architecture in a world of automated change

I posted an entry the other day on the Enterprise CIO blog about the CIO’s role and the self-driving business, and it got me thinking about the enterprise architect and the EA processes (e.g., TOGAF). There seems to be a lack of any real automation thread. Do you see one? This clearly needs to be addressed.

One of the primary roles of an Enterprise Architect is to identify, define and support business transformation projects. The capabilities of the technologies and the business drivers have changed quite dramatically in recent years, but the processes in many ways remain the same. EA practitioners will need to take a very different approach to their role going forward and to how it can shape the business.

Automation will be playing an ever-increasing role in business. One concept that needs to be addressed is Enterprise Context Management, one of those foundational elements needed for automation, yet it is not really part of any EA process work product – at least not one that I know of. To me this is like a repository of enterprise state (for lack of a better term) together with a record of who subscribes to changes in that state.
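
Since I am not aware of a standard work product for this, what follows is only a toy sketch in Python of the idea as I just described it: a repository of enterprise state plus subscribers that are notified when that state changes. The keys, values and callbacks are hypothetical, not any particular product's API.

```python
from collections import defaultdict
from typing import Any, Callable


class EnterpriseContext:
    """Toy repository of enterprise state; interested parties subscribe to changes."""

    def __init__(self) -> None:
        self._state: dict[str, Any] = {}
        self._subscribers: dict[str, list[Callable[[str, Any, Any], None]]] = defaultdict(list)

    def subscribe(self, key: str, callback: Callable[[str, Any, Any], None]) -> None:
        # A process, dashboard, or automation agent registers interest in one piece of context.
        self._subscribers[key].append(callback)

    def set(self, key: str, value: Any) -> None:
        # Record the new state and notify subscribers only when it actually changed.
        old = self._state.get(key)
        self._state[key] = value
        if old != value:
            for callback in self._subscribers[key]:
                callback(key, old, value)


if __name__ == "__main__":
    # Hypothetical usage: an automated process reacts when a plant's status changes.
    ctx = EnterpriseContext()
    ctx.subscribe("plant-3/status", lambda k, old, new: print(f"{k}: {old} -> {new}"))
    ctx.set("plant-3/status", "running")
    ctx.set("plant-3/status", "maintenance")
```

Even this toy version shows why the concept matters to EA: someone has to decide which state is worth keeping, who is allowed to subscribe, and what happens downstream when it changes.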

Gartner came up with the term Vanguard Enterprise Architect to describe EAs who are focused on digital business techniques and their value to the business. As part of this more forward-looking approach, architects need to understand that it’s not about creating documents but about blending people, process and systems to meet business needs. Even with automation techniques, the environment will still need to be human-centric; automation will just use individuals’ attention more efficiently and effectively.

The days of EAs gathering, documenting and then just placing a few recommendations on the table are likely over. EA is not just about hardware, software and projects. Sure, those play a part, but now it is about services, relationships and a holistic ecosystem view aligned to desired outcomes. The expectation should be for the EA to deliver business outcomes, backed by contextual depth of impact and analytics that maximize the value from one of the scarcest resources in any business: the creativity of its people.

7 Questions to Help Look Strategically at IoT

There are still many people who view the Internet of Things as being about ‘the things’ and not the data they provide. Granted, there are definitely some issues with the things themselves, but there are also concerns for the enterprise, like the need to monitor the flow of information coming from these things, especially as we begin to automate the enterprise response to events.

A holistic perspective is needed, and these are the top issues I believe an organization needs to think through when digging into its IoT strategy:

  1. What business value do the devices provide – independent of the data they collect?
    Having said that it is not really about the devices, it remains true that the devices should deliver value in themselves – the data may be just a side effect of that role. Understanding those functions will increase the reliability and usefulness of the data over the long haul. No one wants to build an approach around consuming a data stream just to have it dry up.
  2. What access will the devices have to the enterprise?
    Is it bi-directional? If it is, the security risk of the devices is significantly higher than for those that just provide raw data. If a positive feedback loop exists, it needs to be reinforced and secured. If the data flow is too narrow to justify that level of security, the need for bi-directional information flow should be scrutinized – if the interaction is that valuable, it really needs to be protected. Think about automotive data bus attacks, as an example.
  3. If attacked, how can the devices be updated?
    Do the devices support dynamic software updates and additions? If so, how can those be delivered, and by whom? Users of devices may download applications that contain malware, since it can be disguised as a game, security patch, utility, or other useful application. It is difficult for most people to tell the difference between a legitimate application and one containing malware. For example, an application could be repackaged with malware and a consumer could inadvertently download it onto a device that is part of your IoT environment. Not all IoT devices are limited SCADA solutions; they may be smartphones, TVs… pretty much anything in our environment in the future. (A minimal integrity-check sketch appears after this list.)
  4. How will the data provided be monitored?
    Wireless data can be easily intercepted. When a wireless transmission is not encrypted, eavesdroppers may gain unauthorized access to sensitive information or derived behaviors. The same may be true even of a wired connection. Understanding the frequency of updates and shifts in the data provided is usually an essential part of IoT’s value, and it should be part of the security approach as well.
  5. Can any personal or enterprise contextual information leak from the device connection?
    I blogged a while back about the issue of passive oversharing. As we enable more devices to provide information, we need to understand how that data flow can inadvertently build a contextual understanding about the business or the personnel and their behavior for other than the intended use.
  6. Is the device’s role in collecting information well-known and understood?
    No one likes the thought of ‘big brother’ looking over their shoulder. People can easily feel offended or manipulated if a device enters their work environment and provides data they feel is ‘about them’ without their knowing it is taking place. A solid communications plan that keeps up with changes in how the data is used will be a good investment.
  7. Who are all the entities that consume this data?
    As IoT data is used to provide a deeper contextual understanding of the environment, that understanding may be shared with suppliers, partners and customers. These data flows need to be understood and tracked, like any consumer relationship; otherwise they may easily turn into a string of dominoes that enables unexpected shifts in results as they change. Awareness of enterprise context management will grow in importance over the coming years – note that was context management, not content management.
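
As promised in question 3, here is a small, hedged Python sketch of the simplest kind of integrity check a device (or the gateway in front of it) might run before accepting new software. It assumes a trusted digest is available out of band; real deployments would rely on vendor-signed manifests and public-key signatures, and this is not a description of any particular vendor's update mechanism.

```python
import hashlib
import hmac

# Hypothetical illustration: before applying an update, check that the image
# matches the digest published in a trusted manifest obtained separately.


def digest(firmware: bytes) -> str:
    """SHA-256 digest of a firmware image."""
    return hashlib.sha256(firmware).hexdigest()


def update_is_trusted(firmware: bytes, expected_digest: str) -> bool:
    # Constant-time comparison avoids leaking how much of the digest matched.
    return hmac.compare_digest(digest(firmware), expected_digest)


if __name__ == "__main__":
    image = b"...device firmware bytes..."
    manifest_digest = digest(image)  # would normally come from a signed manifest
    tampered = image + b"unexpected payload"

    print(update_is_trusted(image, manifest_digest))      # True
    print(update_is_trusted(tampered, manifest_digest))   # False
```

The check itself is trivial; the strategic questions from the list above remain who publishes the manifest, how it is protected, and who is allowed to push updates at all.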

All these issues are common to IT systems, but with an IoT deployment, the normal IT organization may only be able to influence how they are addressed.