Voice recognition project completed at UTD

Every semester I try to work with some students at UTD by facilitating a ‘capstone’ project. It’s another dimension of my support for STEM education. Yesterday, they gave their presentation to their professor and class.

This semester the project was creating an Android-based speech recognition solution to facilitate a Voice-based Inspection and Evaluation Framework. We shied away from using Google’s speech recognition, since we wanted off-line capability as well as enhanced security and privacy. Addressing this requirement was one of the first issues the team had to conquer.

They were able to identify and implement an open source library providing the speech recognition (PocketSphinx). They also used Android’s android.speech.tts package for text-to-speech interaction with the user.

The team created a visual programming environment to graphically define a flowchart and export it to an XML file that the mobile device could use to drive the inspection process. The mobile application could store a number of these flowcharts for later use.
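To make the idea concrete, here is a minimal sketch of loading such an inspection flowchart from XML. The element and attribute names are illustrative assumptions on my part, not the students' actual schema.

```python
# Sketch: parsing a hypothetical inspection-flowchart XML export.
# The schema (element/attribute names) is an assumption for illustration.
import xml.etree.ElementTree as ET

SAMPLE = """
<inspection name="forklift-check">
  <step id="1" prompt="Is the horn working?" type="yesno" next-yes="2" next-no="3"/>
  <step id="2" prompt="Enter tire pressure" type="number" min="30" max="40" next="end"/>
  <step id="3" prompt="Describe the fault" type="freeform" next="end"/>
</inspection>
"""

def load_flowchart(xml_text):
    """Parse the XML into a dict of step-id -> step attributes."""
    root = ET.fromstring(xml_text)
    return {step.get("id"): dict(step.attrib) for step in root.findall("step")}

steps = load_flowchart(SAMPLE)
```

On the device, a runtime would simply follow the `next-*` attributes from step to step, prompting the user via text-to-speech at each node.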

The end product was able to handle a range of speech recognition needs:

  • Yes/no
  • Answer from a list of valid responses (e.g., States)
  • Answer with a number (range checked)
  • Free-form sound capture

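Once the recognizer returns a text hypothesis, each of those answer types needs its own validation. The sketch below shows one way that might look; the function and parameter names are illustrative, not taken from the students' code.

```python
# Sketch: validating a recognized utterance against the four answer types.
# Names and signatures are illustrative assumptions.

def validate(answer, kind, *, choices=None, lo=None, hi=None):
    """Return (ok, value) for a recognized utterance."""
    answer = answer.strip().lower()
    if kind == "yesno":
        return (answer in ("yes", "no"), answer)
    if kind == "choice":                      # e.g., a list of valid states
        return (answer in {c.lower() for c in choices}, answer)
    if kind == "number":                      # range-checked numeric answer
        try:
            value = float(answer)
        except ValueError:
            return (False, None)
        return (lo <= value <= hi, value)
    if kind == "freeform":                    # anything goes; keep the audio too
        return (True, answer)
    raise ValueError(f"unknown kind: {kind}")
```

A failed validation would typically trigger a spoken re-prompt rather than an error, keeping the interaction hands-free.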
Overall, I was very impressed with what these students were able to accomplish during the semester and the quality of the Software Life Cycle work products they produced. Naturally, since we didn’t know exactly what they would be able to accomplish, they used a modified agile approach; they still had to produce the work products required for the class on a predefined timetable. We incorporated the concept of designing specific sprints around producing those work products, as well as the typical need to define, document and validate requirements.

I started the project while working at HP, and Dave Gibson and Cliff Wilke helped see it through to the end (they are still with HP).


Security certificate maintenance – there must be a better way

Over the last few years, I’ve seen numerous instances where well-maintained systems, run by organizations with good operational records, have fallen over because of security certificate expiration.

Just last week, Google Mail went down for a significant time when its security key chain broke (note Google’s use of SHA-1 internally – but that’s a whole other issue). Gmail is a solution that is core to an increasing percentage of the population, as well as schools and businesses. Most people likely believe that Google’s operations are well run and world class – yet it stumbled in the same way I’ve seen many others stumble before.

Organizations need a reliable and rigorous approach to tracking their certificate chains, one that proactively warns them before certificates expire, since it can take hours to repair a chain once it breaks. There are many critical tasks that come with certificate management, and ignoring or mishandling any one of them can set the stage for Web application exploits or system downtime.
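The core of such a proactive warning is trivial to sketch. Below is a toy expiry check over a certificate inventory; in practice the `notAfter` values would come from `ssl.getpeercert()` or a network scanner, and the inventory here is purely illustrative.

```python
# Sketch: flag certificates that expire within a warning window.
# The inventory dict is illustrative sample data, not real hosts' certs.
from datetime import datetime, timedelta

def expiring_soon(inventory, now, warn_days=30):
    """Return the names of certs whose notAfter falls within warn_days of now."""
    fmt = "%b %d %H:%M:%S %Y %Z"   # date format used by ssl.getpeercert()
    horizon = now + timedelta(days=warn_days)
    return [name for name, not_after in inventory.items()
            if datetime.strptime(not_after, fmt) <= horizon]

inventory = {
    "mail.example.com": "Jan 15 12:00:00 2026 GMT",
    "www.example.com":  "Jun 30 00:00:00 2027 GMT",
}
warnings = expiring_soon(inventory, datetime(2026, 1, 1))
# warnings -> ["mail.example.com"]
```

The hard part, of course, is not the date arithmetic but keeping the inventory itself complete and current, which is exactly where manual processes fail.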

These certificates (which contain the keys) are the cornerstone of the organization’s cryptography-based defense. As the market-facing application portfolio of an organization expands, the number of certificates will also expand, and the key chains can grow longer with more convoluted interrelationships (especially if they are not planned and are just allowed to evolve). Additionally, the suite of certificate products from vendors can be confusing: there are different levels of validation offered, numerous hash types, key lengths and warranties (which actually protect the end users, not the certificate owner). It can be difficult to know what type of certificate is required for a particular application.

CSS-Security put out this high-level video about certificates and why they’re blooming in organizations (there is an ad at the end of the video about their product to help with certificate management).

Most companies still manage their certificates via a spreadsheet or some other manual process. That may be fine when you’re just getting started, but it can quickly spiral out of control, and addressing the problem may involve costs that are simply not understood.

There are products and approaches for enterprise certificate management. Automation tools can search a network and collect information on all discovered certificates. They can assign certificates to systems and owners and manage automated renewal. These products can also verify that a certificate was deployed correctly, to avoid serving an old certificate. Automated tools are only part of the answer, though, and will still require some manual intervention.

When purchasing one of these certificate management tools, ensure that the software can manage certificates from all CAs, since some will only manage certificates issued from a particular CA.

Thoughts from a discussion about architecture

Yesterday, I had a long discussion with Stephen Heffner, the creator of XTRAN (and president of XTRAN, LLC). XTRAN is a meta-transformation tool that excels at automating software work – sorry, that is the best description I could come up with for a tool that can create solutions to analyze and translate between software languages, data structures and even project work products. When you first read about its capabilities it sounds like magic, but there are numerous working examples available, so you can see its usefulness for yourself.

He and I were talking about the issues and merits of Enterprise Architecture. He wrote a piece titled What is Enterprise Architecture?, where he describes his views on the EA function. Stephen identifies three major impediments to effective EA:

  • Conflating EA with IT
  • Aligning EA with just transformation
  • Thinking the EA is responsible for strategy

We definitely agreed that the prevailing view in most businesses – that the EA function belongs embedded within IT – does not align well with the strategic needs of the business. The role is much broader than IT and needs to embrace the broader business issues that IT should support.

I had a bit of a problem with the point about aligning EA with just transformation, but that may simply be a difference in context. One of the real killers of EA for me is a focus on work products rather than outcomes. The EA should always push for greater flexibility for the business, adding rigor without increasing rigidity. Rigidity is aligned with death – hence the term rigor mortis. To me, the EA function always has a transformational element.

The final point was that the EA supports strategy, and that the business needs to have a strategy. The EA is not the CEO, and the CEO is probably not an EA. The EA does need to understand the current state of the business environment, though. I was talking with an analyst one day who told me that an EA needs to focus on the vision and shouldn’t worry about a current situational assessment. My response was, “If you don’t know where you are, you’ll not be able to define a journey to where you need to be.” Stephen agreed with that perspective.

My view is that there are 4 main elements of an effective architecture:

  • People – Architecture lives at the intersection of business and technology. People live at the focus of that intersection, not technology. Architectural efforts should focus on the effect upon the people involved. What needs to happen? How will it be measured? These factors can be used to make course corrections along the way, once you realize: an architecture is never finished. If it doesn’t deliver as expected, change it. Make the whole activity transparent, so that people can buy in instead of throwing stones. My view is that if I am talking with someone about architecture and they don’t see its value, it is my fault.
  • Continuous change – When you begin to think of the business as dynamic and not static, the relationship with the real world becomes clear. In nature, those species that are flexible and adjust to meet the needs of the environment can thrive – those that can’t adjust die off.
    An architecture needs standards, but it also needs to recognize where compromises can be made. Take Shadow IT, for example: it is better to understand and facilitate its effective use (through architecture) than to try to stand in the way and get run over.
    In a similar way, the link between the agile projects and the overall architecture need to be recursive, building upon the understanding that develops. The architecture does not stand alone.
    Architecture development can also have short sprints of understanding, documenting and standardizing the technical innovations that take place, while minimizing technical debt.
  • Focus on business-goal based deliverables – Over the years, I’ve seen too many architectural efforts end up as shelf-ware. In the case of architecture, just-in-time is probably the most effective and accurate approach, since the technology and the business are changing continuously. Most organizations would just laugh at a 5-year technology strategy today, even though many of the technical trends are predictable. So I don’t mean you shouldn’t frame out a high-level one – just ‘don’t believe your own press’.
    If the architecture work products can be automated or at least integrated with the tooling used in the enterprise, it will be more accurate and useful. This was actually a concept that Stephen and I discussed in depth. The concept of machine and human readable work products should be part of any agile architecture approach.
    From a goal-based perspective, the architecture needs to understand at a fundamental level what is scarce for the organization and what is abundant and then maximize the value generated from what is scarce – or at least unique to the organization.
  • Good enough – Don’t let the perfect architecture stand in the way of one that is ‘good enough’ for today. All too often I’ve seen architecture analysis go down to 2 or 3 levels of detail. Then people say “if 2 is good, let’s go to 5 levels of depth.” Unfortunately, with each level of detail the cost to develop and maintain goes up by an order of magnitude – know when to stop. I’ve never seen a single instance where these highly detailed architecture definitions were maintained for more than 2 or 3 years, since they may actually cost as much to maintain as they took to create. Few organizations have the intestinal fortitude to keep that up for long.
    The goal should be functional use, not a focus on perfection. Architecting the simplest solution that works today is generally best. If you architect the solution for something that will be needed 5 years out, either the underlying business need or the technical capabilities will change before it is actually used.

None of this is really revolutionary. Good architects have been taking an approach like this for years. It is just easy to interpret some of the architecture process materials from an ivory tower (or IT only) perspective.

Enterprise architecture in a world of automated change

I posted an entry the other day on the Enterprise CIO blog about the CIO’s role and the self-driving business, and it got me thinking about the enterprise architect and EA processes (e.g., TOGAF). There seems to be a lack of any real automation thread in them. Do you see one? This clearly needs to be addressed.

One of the primary roles of an Enterprise Architect is to identify, define and support business transformation projects. The capabilities of the technologies and the business drivers have changed quite dramatically in recent years but the processes in many ways remain the same. EA practitioners will need to take a very different approach to their role going forward and how it can shape the business.

Automation will be playing an ever increasing role in business. One concept that needs to be addressed is Enterprise Context Management, one of those foundational elements needed for automation, yet not really part of any EA process work product – at least none that I know of. To me this is like a repository of enterprise state (for lack of a better term), along with a record of who subscribes to changes in that state.
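A toy sketch of that "repository of state plus subscribers" idea follows. The class and key names are illustrative inventions, not a reference to any EA product.

```python
# Sketch: a minimal "enterprise context" repository where consumers
# subscribe to changes in named pieces of state. Purely illustrative.
from collections import defaultdict

class ContextRepository:
    def __init__(self):
        self._state = {}
        self._subscribers = defaultdict(list)

    def subscribe(self, key, callback):
        """Register callback(key, old, new) for changes to one state key."""
        self._subscribers[key].append(callback)

    def set(self, key, value):
        """Update state and notify subscribers only when the value changes."""
        old = self._state.get(key)
        if old != value:
            self._state[key] = value
            for cb in self._subscribers[key]:
                cb(key, old, value)

    def get(self, key):
        return self._state.get(key)

repo = ContextRepository()
events = []
repo.subscribe("plant-3/line-status", lambda k, old, new: events.append((old, new)))
repo.set("plant-3/line-status", "running")
repo.set("plant-3/line-status", "stopped")
```

The point is less the mechanics than the work product: an EA process would need to document which systems publish which state and which processes depend on it.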

Gartner came up with the term Vanguard Enterprise Architect, describing EAs who are focused on digital business techniques and their value to the business. As part of this more forward-looking approach, architects need to understand that it’s not about creating documents but about blending people, process and systems to meet business needs. Even with automation techniques, the environment will still need to be human-centric; automation will just use individuals’ attention more efficiently and effectively.

The days of EAs gathering, documenting and then just placing a few recommendations on the table are likely over. EA is not just about hardware, software and projects. Sure, those play a part, but now it is services, relationships and a holistic ecosystem view aligned to desired outcomes. The expectation should be for the EA to deliver business outcomes, backed by contextual depth of impact and analytics that maximize the value from one of the scarcest resources in any business: the creativity of its people.

7 Questions to Help Look Strategically at IoT

There are still many people who view the Internet of Things as focused on ‘the things’ and not the data they provide. Granted, there are definitely some issues with the things themselves, but there are also concerns for the enterprise, like the need to monitor the flow of information coming from these things, especially as we begin to automate the enterprise response to events.

A holistic perspective is needed and these are the top issues I believe an organization needs to think through when digging into their IoT strategy:

  1. What business value do the devices provide – independent of the data they collect?
    Having said that it is not really about the devices, it remains true that the devices should deliver value in themselves – the data may be just a side effect of that role. Understanding those functions will increase the reliability and usefulness of the data over the long haul. No one wants to build a process around consuming a data stream just to have it dry up.
  2. What access will the devices have to the enterprise?
    Is it bi-directional? If it is, the security risk of the devices is significantly higher than for those that just provide raw data. If a feedback loop exists, it needs to be reinforced and secured. If the data flow is too narrow to justify this level of security, the need for bi-directional information flow should be scrutinized – if the interaction is that valuable, it really needs to be protected. Think about automotive data bus attacks, as an example.
  3. If attacked, how can the devices be updated?
    Do the devices support dynamic software updates and additions? If so, how can those be delivered, and by whom? Users of devices may download applications that contain malware, since malware can be disguised as a game, security patch, utility or other useful application. It is difficult for most users to tell the difference between a legitimate application and one containing malware. For example, an application could be repackaged with malware and a consumer could inadvertently download it onto a device that is part of your IoT environment. Not all IoT devices are limited SCADA solutions; they may be smartphones, TVs… pretty much anything in our environment in the future.
  4. How will the data provided be monitored?
    Wireless data can be easily intercepted. When a transmission is not encrypted, eavesdroppers may gain unauthorized access to sensitive information or derived behaviors; the same may be true even of a wired connection. Understanding the frequency of updates and shifts in the data provided is usually an essential part of IoT’s value, and it should be part of the security approach as well.
  5. Can any personal or enterprise contextual information leak from the device connection?
    I blogged a while back about the issue of passive oversharing. As we enable more devices to provide information, we need to understand how that data flow can inadvertently build a contextual understanding about the business or the personnel and their behavior for other than the intended use.
  6. Is the device’s role in collecting information well-known and understood?
    No one likes the thought of ‘big brother’ looking over their shoulder. People can easily feel offended or manipulated if a device enters their work environment and provides data they feel is ‘about them’ without their knowledge. A solid communications plan that keeps up with changes in how the data is used will be a good investment.
  7. Who are all the entities that consume this data?
    As IoT data is used to provide a deeper contextual understanding of the environment, that understanding may be shared with suppliers, partners and customers. These data flows need to be understood and tracked, like any consumer relationship; otherwise they may easily turn into a string of dominoes, enabling unexpected shifts in results as they change. Awareness of enterprise context management will grow in importance over the coming years – note that was not content management but context management.

All these issues are common to IT systems, but with an IoT deployment, the normal IT organization may only be able to influence how they are addressed.