Voice recognition project completed at UTD

Every semester I try to work with some students at UTD by facilitating a ‘capstone’ project. It’s another dimension of my support for STEM education. Yesterday, they gave their presentation to their professor and class.

This semester’s project was creating an Android-based speech recognition solution to facilitate a Voice-based Inspection and Evaluation Framework. We shied away from using Google’s speech recognition, since we wanted offline capabilities as well as enhanced security/privacy. Addressing this expectation was one of the first issues the team had to conquer.

They were able to identify and integrate an open-source library (PocketSphinx) to provide the speech recognition. They also used Android’s android.speech.tts API for text-to-speech interaction with the user.
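
For readers who want a feel for how those pieces fit together, here is a minimal sketch (not the team’s actual code) of prompting a user with android.speech.tts and listening offline with the pocketsphinx-android library; the model, dictionary and grammar file names, and the "yes_no" search, are assumptions for illustration:

```java
// Hedged sketch: offline prompting and listening on Android.
// File names and the "yes_no" search are assumptions, not the team's code.
import android.content.Context;
import android.speech.tts.TextToSpeech;
import edu.cmu.pocketsphinx.Assets;
import edu.cmu.pocketsphinx.SpeechRecognizer;
import edu.cmu.pocketsphinx.SpeechRecognizerSetup;
import java.io.File;

public class VoicePrompt {
    private TextToSpeech tts;
    private SpeechRecognizer recognizer;

    public void init(Context context) throws Exception {
        // Text-to-speech for reading the inspection question aloud
        tts = new TextToSpeech(context, status -> { /* check status == TextToSpeech.SUCCESS */ });

        // Offline recognizer built from an acoustic model and dictionary shipped as app assets
        File assetsDir = new Assets(context).syncAssets();
        recognizer = SpeechRecognizerSetup.defaultSetup()
                .setAcousticModel(new File(assetsDir, "en-us-ptm"))
                .setDictionary(new File(assetsDir, "cmudict-en-us.dict"))
                .getRecognizer();

        // A small JSGF grammar constrains the answer to yes/no style responses
        recognizer.addGrammarSearch("yes_no", new File(assetsDir, "yes_no.gram"));
    }

    public void ask(String question) {
        tts.speak(question, TextToSpeech.QUEUE_FLUSH, null, "inspection-prompt");
        recognizer.startListening("yes_no");  // results arrive via a RecognitionListener callback
    }
}
```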

The team created a visual programming environment to graphically define a flowchart and export it to an XML file that the mobile device used to drive the inspection process. The mobile application could store a number of these flows for later use.
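
Since the team’s XML schema wasn’t published with this write-up, the sketch below uses a hypothetical <step> element and attribute names just to illustrate how a stored flow might be loaded on the device:

```java
// Hedged sketch: loading a stored inspection flow from XML on the device.
// The <step> element and its attributes are a hypothetical schema.
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class FlowLoader {
    public static class Step {
        public final String id, prompt, answerType;  // e.g. "yesno", "list", "number", "audio"
        Step(String id, String prompt, String answerType) {
            this.id = id; this.prompt = prompt; this.answerType = answerType;
        }
    }

    public static List<Step> load(File xmlFile) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(xmlFile);
        NodeList nodes = doc.getElementsByTagName("step");
        List<Step> steps = new ArrayList<>();
        for (int i = 0; i < nodes.getLength(); i++) {
            Element e = (Element) nodes.item(i);
            steps.add(new Step(e.getAttribute("id"),
                               e.getAttribute("prompt"),
                               e.getAttribute("answerType")));
        }
        return steps;
    }
}
```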

The end product was able to handle a range of speech recognition needs:

  • Yes/no
  • Answer from a list of valid responses (e.g., States)
  • Answer with a number (range checked)
  • Free-form sound capture
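
To make those response types concrete, here is a hedged sketch of how a recognized answer might be validated for the first three cases (free-form capture would simply record audio rather than validate text); the class and method names are mine, not the team’s:

```java
// Hedged sketch: validating recognized answers against the response types above.
import java.util.Arrays;
import java.util.List;

public class AnswerValidator {

    // Yes/no answers
    public static boolean isYesNo(String heard) {
        return heard.equalsIgnoreCase("yes") || heard.equalsIgnoreCase("no");
    }

    // Answer from a list of valid responses (e.g., U.S. states)
    public static boolean isInList(String heard, List<String> valid) {
        return valid.stream().anyMatch(v -> v.equalsIgnoreCase(heard));
    }

    // Numeric answer with a range check
    public static boolean isNumberInRange(String heard, int min, int max) {
        try {
            int value = Integer.parseInt(heard);
            return value >= min && value <= max;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isYesNo("Yes"));                                    // true
        System.out.println(isInList("Texas", Arrays.asList("Texas", "Ohio"))); // true
        System.out.println(isNumberInRange("42", 0, 100));                     // true
    }
}
```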

Overall, I was very impressed with what these students were able to accomplish during the semester and with the quality of the Software Life Cycle work products they produced. Naturally, since we didn’t know exactly what they would be able to accomplish, they used a modified agile approach; they still had to produce the work products required for the class on a predefined timetable. We incorporated the concept of designing specific sprints around producing those work products, as well as around the typical need to define, document and validate requirements.

I started the project while working at HP, and Dave Gibson and Cliff Wilke (both still with HP) helped facilitate it through to the end.

Security certificate maintenance – there must be a better way

Over the last few years, I’ve seen numerous instances where well-maintained systems, run by organizations with good operational records, have fallen over because of security certificate expiration.

Just last week, Google Mail went down for a significant time when its security certificate chain broke (note Google’s use of SHA-1 internally – but that’s a whole other issue). Gmail is a solution that is core to an increasing share of the population, as well as schools and businesses. Most people likely believe that Google’s operations are well run and world class – yet they stumbled in the same way I’ve seen many others stumble before.

Organizations need a reliable and rigorous approach to tracking their certificate chains, one that proactively warns them before certificates expire, since it can take hours to repair a chain once it breaks. There are many critical tasks that come with certificate management, and ignoring or mishandling any one of them can set the stage for web application exploits or system downtime.
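
The core of such a proactive warning does not have to be elaborate. The sketch below is a minimal illustration, assuming a hand-maintained host list and a 30-day warning window; it connects to each endpoint and flags any certificate in the served chain that is about to expire:

```java
// Minimal sketch of proactive certificate expiry warning.
// Host list and 30-day window are assumptions; a real deployment would also
// cover internal CAs, client certificates and code-signing keys.
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;
import java.security.cert.Certificate;
import java.security.cert.X509Certificate;
import java.time.Instant;
import java.time.temporal.ChronoUnit;

public class CertExpiryCheck {
    public static void main(String[] args) throws Exception {
        String[] hosts = {"example.com", "mail.example.com"};  // hypothetical inventory
        Instant warnBefore = Instant.now().plus(30, ChronoUnit.DAYS);

        for (String host : hosts) {
            try (SSLSocket socket = (SSLSocket)
                    SSLSocketFactory.getDefault().createSocket(host, 443)) {
                socket.startHandshake();
                for (Certificate cert : socket.getSession().getPeerCertificates()) {
                    X509Certificate x509 = (X509Certificate) cert;
                    if (x509.getNotAfter().toInstant().isBefore(warnBefore)) {
                        System.out.printf("WARNING: %s : '%s' expires %s%n",
                                host, x509.getSubjectX500Principal(), x509.getNotAfter());
                    }
                }
            }
        }
    }
}
```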

These certificates (which contain the keys) are the cornerstone of an organization’s cryptography-based defenses. As the market-facing application portfolio expands, the number of certificates also expands, and the key chains can get longer with more convoluted interrelationships (especially if they are not planned and are just allowed to evolve). Additionally, the suite of certificate products from vendors can be confusing: there are different levels of validation, numerous hash types, key lengths and warranties (which actually protect the end users, not the certificate owner). It can be difficult to know what type of certificate a particular application requires.

CSS-Security put out this high-level video about certificates and why they’re proliferating in organizations (there is an ad at the end of the video for their product to help with certificate management).

Most companies still manage their certificates via a spreadsheet or some other manual process. That may be fine when you’re just getting started, but it can quickly spiral out of control, and addressing the problem may involve costs that are not well understood.

There are products and approaches for enterprise certificate management. Automation tools can search a network and collect information on all discovered certificates. They can assign certificates to systems and owners and manage automated renewal. These products can also check that a certificate was deployed correctly, to avoid serving an old certificate. Automated tools are only part of the answer, though, and will still require some manual intervention.
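
One of those deployment checks is straightforward to picture. The following sketch (host name and expected value are placeholders) compares the SHA-256 fingerprint of the certificate a server is actually serving against the fingerprint of the certificate you expect after a renewal:

```java
// Hedged sketch: verify the served certificate matches the expected one after renewal.
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;
import java.security.MessageDigest;
import java.security.cert.X509Certificate;

public class DeploymentCheck {
    static String sha256Fingerprint(X509Certificate cert) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(cert.getEncoded());
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) hex.append(String.format("%02x", b));
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        String host = "example.com";                       // hypothetical host
        String expected = "<fingerprint-of-renewed-cert>"; // placeholder value
        try (SSLSocket socket = (SSLSocket)
                SSLSocketFactory.getDefault().createSocket(host, 443)) {
            socket.startHandshake();
            X509Certificate served =
                    (X509Certificate) socket.getSession().getPeerCertificates()[0];
            String actual = sha256Fingerprint(served);
            System.out.println(actual.equals(expected)
                    ? "Renewed certificate is live"
                    : "Server is still serving an old or unexpected certificate");
        }
    }
}
```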

When purchasing one of these certificate management tools, ensure that the software can manage certificates from all CAs, since some will only manage certificates issued by a particular CA.

Thoughts from a discussion about architecture

Yesterday, I had a long discussion with Stephen Heffner, the creator of XTRAN (and president of XTRAN, LLC). XTRAN is a meta-transformation tool that excels at automating software work – sorry, that is the best description I could come up with for a tool that can create solutions to analyze and translate between software languages, data structures and even project work products. When you first read about its capabilities it sounds like magic, but there are numerous working examples available so you can see its usefulness for yourself.

He and I were talking about the issues and merits of Enterprise Architecture. He wrote a piece titled What is Enterprise Architecture?, where he describes his views on the EA function. Stephen identifies three major impediments to effective EA:

  • Conflating EA with IT
  • Aligning EA with just transformation
  • Thinking the EA is responsible for strategy

We definitely agreed that the prevailing view in most businesses, that the EA function is embedded within IT, does not align well with the strategic needs of the business. The role is much broader than IT and needs to embrace the wider business issues that IT should support.

I had a bit of a problem with aligning EA only with transformation, but that may just be a difference in context. One of the real killers of EA for me is a focus on work products rather than outcomes. The EA should always focus on greater flexibility for the business, adding rigor without increasing rigidity. Rigidity is aligned with death – hence the term rigor mortis. To me, the EA function always has a transformational element.

The final point was that the EA supports strategy, and the business needs to have a strategy. The EA is not the CEO, and the CEO is probably not an EA. The EA does need to understand the current state of the business environment, though. I was talking with an analyst one day who told me that an EA needs to focus on the vision and shouldn’t worry about a current situational assessment. My response was that “If you don’t know where you are, you’ll not be able to define a journey to where you need to be.” Stephen agreed with that perspective.

My view is that there are four main elements of an effective architecture:

  • People – Architecture lives at the intersection of business and technology, and people, not technology, live at the focus of that intersection. Architectural efforts should focus on their effect on the people involved. What needs to happen? How will it be measured? These factors can be used to make course corrections along the way, once you realize that an architecture is never finished. If it doesn’t deliver as expected, change it. Make the whole activity transparent, so that people can buy in instead of throwing stones. My view is that if I am talking with someone about architecture and they don’t see its value, it is my fault.
  • Continuous change – When you begin to think of the business as dynamic and not static, the relationship with the real world becomes clear. In nature, those species that are flexible and adjust to meet the needs of the environment can thrive – those that can’t adjust die off.
    Architectures need standards, but they also need to recognize where compromises can be made. Take shadow IT, for example: it is better to understand and facilitate its effective use (through architecture) than to try to stand in the way and get run over.
    In a similar way, the link between agile projects and the overall architecture needs to be recursive, building on the understanding that develops. The architecture does not stand alone.
    Architecture development can also have short sprints of understanding, documenting and standardizing the technical innovations that take place, while minimizing technical debt.
  • Focus on business-goal-based deliverables – Over the years, I’ve seen too many architectural efforts end up as shelf-ware. In the case of architecture, just-in-time is probably the most effective and accurate approach, since the technology and the business are changing continuously. Most organizations would just laugh at a five-year technology strategy today. After all, many of the technical trends are predictable, so I don’t mean you shouldn’t frame out a high-level one – just ‘don’t believe your own press’.
    If the architecture work products can be automated, or at least integrated with the tooling used in the enterprise, they will be more accurate and useful. This was actually a concept that Stephen and I discussed in depth. The concept of machine- and human-readable work products should be part of any agile architecture approach.
    From a goal-based perspective, the architecture needs to understand at a fundamental level what is scarce for the organization and what is abundant and then maximize the value generated from what is scarce – or at least unique to the organization.
  • Good enough – Don’t let the perfect architecture stand in the way of one that is ‘good enough’ for today. All too often I’ve seen architecture analysis go down two or three levels of detail, and then people say, “if 2 is good, let’s go to 5 levels of depth.” Unfortunately, with each level of detail the cost to develop and maintain goes up by an order of magnitude – know when to stop. I’ve never seen a single instance where these highly detailed architecture definitions were maintained for more than two or three years, since they may actually cost as much to maintain as they took to create. Few organizations have the intestinal fortitude to keep that up for long.
    The goal should be functional use, not a focus on perfection. Architecting the simplest solution that works today is generally best. If you architect a solution for something that will be needed five years out, either the underlying business need or the technical capabilities will change before it is actually used.

None of this is really revolutionary. Good architects have been taking an approach like this for years. It is just easy to interpret some of the architecture process materials from an ivory tower (or IT only) perspective.