Our people weigh in on the issues of the day.
Blue Slate's people think a lot about the challenges facing their industries today. In the process, they often come up with completely unexpected slants on current issues, or new ways of thinking about business problems. Bluespeak is where they share those thoughts. Feel free to read and reflect.
[Any views or opinions expressed in this blog are personal, belong solely to the blogger, and do not represent those of Blue Slate Solutions.]
I am excited to share the news that Blue Slate Solutions has kicked off a formal innovation program, creating a lab environment which will leverage the Cognitive Corporation™ framework and apply it to a suite of processes, tools and techniques. The lab will use a broad set of enterprise technologies, applying the learning organization concepts implicit in the Cognitive Corporation’s™ feedback loop.
I’ve blogged a couple of times (see references at the end of this blog entry) about the Cognitive Corporation™. The depiction has changed slightly but the fundamentals of the framework are unchanged.
The focus is to create a learning enterprise, where the learning is built into the system integrations and interactions. Enterprises have been investing in these individual components for several years; however, they have not truly been integrating them in a way that promotes learning.
By “integrating” I mean allowing the systems to understand the meaning of the data being passed between them. Creating a screen in a workflow (BPM) system that presents data from a database to a user is not “integration,” in my opinion; it is simply passing data around. This prevents the enterprise ecosystem (all the components) from working together and learning collectively.
I liken such connections to my taking a hand-written note in a foreign language, which I don’t understand, and typing the text into an email for someone who does understand the original language. Sure, the recipient can read it, but I, representing the workflow tool passing the information from database (note) to screen (email) in this case, have no idea what the data means and cannot possibly participate in learning from it. Integration requires understanding. Understanding requires defined and agreed-upon semantics.
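As a sketch of what “defined and agreed-upon semantics” can look like in practice, here is a small RDF fragment in Turtle. The ex: vocabulary and the claim data are hypothetical; the point is that once two systems agree on the vocabulary, the payload carries its meaning with it rather than being an opaque string:

```turtle
@prefix ex:  <http://example.com/claims#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

# Any system that shares the ex: vocabulary can interpret this record,
# not merely display it: the amount is a decimal, the filer is a customer.
ex:claim42  ex:claimAmount  "1250.00"^^xsd:decimal ;
            ex:filedBy      ex:customer7 .
```

A workflow tool that understands ex:claimAmount can learn from the values it routes; one that only moves the string "1250.00" cannot.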
This is just one of the Cognitive Corporation™ concepts that we will be exploring in the lab environment. We will also be looking at the value of these technologies within different horizontal and vertical domains. Given our expertise in healthcare, finance and insurance, our team is well positioned to use the lab to explore the use of learning BPM in many contexts. [Read More]

Semantic Technology and Business Conference, East 2011 – Reflections
I had the pleasure of attending the Semantic Technology and Business Conference in Washington, DC last week. I have a strong interest in semantic technology and its capabilities to enhance the way in which we leverage information systems. There was a good selection of topics discussed by people with a variety of backgrounds working in different verticals.
To begin the conference I attended the half-day “Ontology 101” session presented by Elisa Kendall and Deborah McGuinness. They indicated that this presentation has been given at every semantic technology conference and that interest remains strong, implying that new people continue to want to understand this art.
Their material was very useful, and if you are looking to get a grounding in ontologies (What are they? How do you go about creating them?), I recommend attending this session the next time it is offered. Both presenters clearly have deep experience and expertise in this field. Also, the discussion was not tied to a particular technology (e.g. RDF), so it was applicable regardless of underlying implementation details.
I wrapped up the first day with Richard Ordowich, who discussed the process of reverse engineering semantics (meaning) from legacy data. The goal of such projects is to harmonize the meaning of information across the enterprise.
A point he stressed was that a business really needs to be ready to start such a journey. This type of work is very hard and very time consuming, and it requires enterprise-wide discipline. He suggests that before working with a company on such an initiative, one should ask for examples of prior enterprise-program successes (e.g. BPM or SDLC adoption).
Fundamentally, a project that seeks to harmonize the meaning of data across an enterprise requires organizational readiness that goes beyond project execution. The enterprise must put effective governance in place to operate and maintain the resulting ontologies, taxonomies and metadata.
The full conference kicked off the following day. One aspect that jumped out at me was that many of the presentations dealt with government-related projects. This could have been a side effect of the conference being held in Washington, DC, but I think it is more indicative that spending on this technology is weighted more heavily toward the public sector than toward private industry.
With so many projects being government-centric, I found claims of “value” suspect. A project can be valuable, or show value, without being cost effective. Commercial businesses have gone bankrupt even while delivering value to their customers. More exposure of positive-ROI commercial projects will be important to help accelerate the adoption of these technologies.
Other than the financial aspect, the presentations were incredibly valuable in terms of presenting lessons learned, best practices and in-depth tool discussions. I’ll highlight a few of the sessions and key thoughts that I believe will assist as we continue to apply semantic technology to business system challenges. [Read More]

Using ARQoid for Android-based SPARQL Query Execution
I was recently asked about the SPARQL support in Sparql Droid and whether it could serve as a way for other Android applications to execute SPARQL queries against remote data sources. It could be used in this way but there is a simpler alternative I’d like to discuss here.
On the Android platform it is actually quite easy to execute SPARQL against remote SPARQL endpoints, RDF data and local models. The heavy lifting is handled by Androjena’s ARQoid, an Android-centric port of HP’s Jena ARQ engine.
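To give a flavor of the API, here is a minimal sketch of querying a remote SPARQL endpoint. ARQoid mirrors the Jena ARQ API, so essentially the same code runs with Jena on the desktop. The endpoint and query below are illustrative stand-ins, not taken from the original post's SparqlExamples class:

```java
import com.hp.hpl.jena.query.*;  // ARQoid ports these Jena ARQ packages to Android

public class RemoteSparqlSketch {
    public static void main(String[] args) {
        // Hypothetical query against a public endpoint (DBpedia used for illustration)
        String sparql =
            "SELECT ?label WHERE { "
          + "  <http://dbpedia.org/resource/Semantic_Web> "
          + "  <http://www.w3.org/2000/01/rdf-schema#label> ?label } LIMIT 5";

        Query query = QueryFactory.create(sparql);
        QueryExecution qe = QueryExecutionFactory.sparqlService(
                "http://dbpedia.org/sparql", query);
        try {
            // Iterate the result set just as with desktop Jena
            ResultSet results = qe.execSelect();
            while (results.hasNext()) {
                QuerySolution row = results.nextSolution();
                System.out.println(row.getLiteral("label").getString());
            }
        } finally {
            qe.close();  // always release the connection
        }
    }
}
```

The engine handles the SPARQL protocol details (encoding the query, parsing the response), which is the heavy lifting referred to above.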
Both engines (the original and the port) do a great job of simplifying the execution of SPARQL queries and the consumption of the resulting data. In this post I’ll go through a simple example of using ARQoid. Note that all the code shown here is available for download. This post is based specifically on the queryRemoteSparqlEndpoint() method in the com.monead.androjena.demo.arqoid.SparqlExamples class. [Read More]

The Cognitive Corporation™ – Effective BPM Requires Data Analytics
The Cognitive Corporation™ is a framework introduced in an earlier posting. The framework is meant to outline a set of general capabilities that work together in order to support a growing and thinking organization. For this post I will drill into one of the least mature of those capabilities in terms of enterprise solution adoption – Learn.
Business rules, decision engines, BPM, complex event processing (CEP): these all evoke images of computers making speedy decisions to the benefit of our businesses. The infrastructure, technologies and software that provide these solutions (SOA, XML schemas, rule engines, workflow engines, etc.) support the decision automation process. However, they don’t know what decisions to make.
The BPM-related components we acquire provide the how of decision making (send an email, route a claim, suggest an offer). Learning, supported by data analytics, provides a powerful path to the what and why of automated decisions (send this email to that person because they are at risk of defecting, route this claim to that underwriter because it looks suspicious, suggest this product to that customer because they appear to be buying these types of items).
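As a toy illustration of this what-and-why loop, consider a rule derived from historical data, applied to produce a response, and then re-derived as the responses generate new data. The claim amounts and the 1.5× margin below are hypothetical placeholders for real analytics:

```java
import java.util.Arrays;

public class LearningLoopSketch {
    // Derive a "rule" from data: flag claims above 1.5x the historical mean
    // (the margin is a hypothetical stand-in for a real analytic model)
    static double deriveThreshold(double[] claimAmounts) {
        double mean = Arrays.stream(claimAmounts).average().orElse(0.0);
        return mean * 1.5;
    }

    // Apply the rule: the "how" of the decision
    static boolean flagClaim(double amount, double threshold) {
        return amount > threshold;
    }

    public static void main(String[] args) {
        double[] history = {100, 120, 90, 110};
        double threshold = deriveThreshold(history);   // data leads to rules
        boolean flagged = flagClaim(400, threshold);   // rules beget responses

        // Responses manifest as more data, and new data leads to new rules
        double[] updated = {100, 120, 90, 110, 400};
        double newThreshold = deriveThreshold(updated);

        System.out.println("flagged=" + flagged
                + " threshold=" + threshold + " -> " + newThreshold);
    }
}
```

The point is the shape of the cycle, not the arithmetic: the rule is recomputed from the data its own responses helped create.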
I’ll start by outlining the high-level journey from data to rules and the cyclic nature of that journey. Data leads to rules, rules beget responses, responses manifest as more data, new data leads to new rules, and so on. Therefore, the journey does not end with the definition of a set of processes and rules. This link between updated data and the determination of new processes and rules is the essence of any learning process, providing a key function for the cognitive corporation. [Read More]

Expanding on “Code Reviews Trump Unit Testing, But They Are Better Together”
Michael Delaney, a senior consulting software engineer at Blue Slate, commented on my previous posting. As I created a reply I realized that I was expanding on my reasoning and it was becoming a bit long. So, here is my reply as a follow-up posting. Also, thank you to Michael for helping me think more about this topic.
I understand the desire to rely on unit testing and its ability to find issues and prevent regressions. (TDD deserves a separate write-up.) Fundamentally, I’m a believer in white box testing. Black box approaches, like TDD, seem to be of relatively little value to the overall quality and reliability of the code; meaning, I’d want to invest more effort in white box testing than in black box testing.
I’m somewhat jaded, being concerned with the code’s security, which to me is strongly correlated with its reliability. That said, I believe that unit testing is much more constrained than formal reviews. I’m not suggesting that unit tests be skipped, rather that we understand that unit tests can catch certain types of flaws, and that those types are narrow compared to what formal reviews can identify. [Read More]

Code Reviews Trump Unit Testing, But They Are Better Together
Last week I was participating in a formal code review (a.k.a. code inspection) with one of our clients. We have been working with this client, helping them strengthen their development practices. Holding formal code reviews is a key component for us. Part of the formal process we introduced includes reviewing the unit testing results, both the (successful) output report and the code coverage metrics.
At one point we were reviewing some code that had several error handling blocks that were not being covered in the unit tests. These blocks were, arguably, unlikely or impossible to reach (such as a Java StringReader throwing an IOException). There was some discussion by the team about the necessity of mocking enough functionality to cover these blocks.
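As a concrete (hypothetical) example of such a block, consider reading from a java.io.StringReader. Its read() method is declared to throw IOException, so the compiler requires a handler, yet the exception is only raised if you read after calling close(); in normal use the catch block is effectively unreachable, and coverage tools report it as untested:

```java
import java.io.IOException;
import java.io.StringReader;

public class UnreachableCatchDemo {
    static int countChars(String text) {
        StringReader reader = new StringReader(text);
        int count = 0;
        try {
            // read() returns -1 at end of input
            while (reader.read() != -1) {
                count++;
            }
        } catch (IOException e) {
            // Practically unreachable: StringReader only throws IOException
            // when read after close(). Coverage reports still flag this block.
            return -1;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countChars("claim"));
    }
}
```

Mocking a reader that throws would cover the block, but the review discussion was about whether that effort is worthwhile.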
Although we agreed that some of the more esoteric error conditions weren’t worth the programmer’s time to mock up, it occurred to me later that we were missing an important point. What mattered was that we were holding a formal code review and looking at those blocks of code. [Read More]

The Cognitive Corporation™ – An Introduction
Given my role as an enterprise architect, I’ve had the opportunity to work with many different business leaders, each focused on leveraging IT to drive improved efficiencies, lower costs, increase quality, and broaden market share throughout their businesses. The improvements might involve any subset of data, processes, business rules, infrastructure, software, hardware, etc. A common thread is that each project seeks to make the corporation smarter through the use of information technology.
As I’ve placed these separate projects into a common context of my own, I’ve concluded that the long term goal of leveraging information technology must be for it to support cognitive processes. I don’t mean that the computers will think for us, rather that IT solutions must work together to allow a business to learn, corporately.
The individual tools that we utilize each play a part. However, we tend to utilize them in a manner that focuses on isolated and directed operation rather than incorporating them into an overall learning loop. In other words, we install tools that we direct without asking them to help us find better directions to give.
Let me start with a definition: similar to thinking beings, a cognitive corporation™ leverages a feedback loop of information and experiences to inform future processes and rules. Fundamentally, learning is a process: it involves taking known facts and experiences and combining them to create new hypotheses, which are tested in order to derive new facts, processes and rules. Unfortunately, we don’t often leverage our enterprise applications in this way. [Read More]

Fuzzing – A Powerful Technique for Software Security Testing
Unexpected input is what’s useful when looking for untested paths through the code. If someone shows me an application for evaluation, the last thing I need to worry about is using it in an expected fashion; everyone else will do that. In fact, I default to entering data outside the specification when looking at a new application. I don’t know that my team always appreciates the approach; they’d probably like to see the application work at least once while I’m in the room.
These days there is a formal name for this type of testing: fuzzing. A few years ago I preferred calling it “gorilla testing” since I liked the mental picture of beating on the application. (Remember the American Tourister luggage ad in the 1970s?) But alas, it appears that fuzzing has become the accepted term.
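A minimal sketch of testing in this style, with a toy parser and mutation strategy (all names hypothetical): start from a valid input, randomly corrupt it, and watch for unexpected failures that reveal untested paths:

```java
import java.util.Random;

public class FuzzSketch {
    // A toy parser with an input "rule": it expects "name=value"
    static String parseValue(String input) {
        int eq = input.indexOf('=');
        if (eq < 0) {
            throw new IllegalArgumentException("missing '='");
        }
        return input.substring(eq + 1);
    }

    // Break the rules: overwrite a few random positions with random characters
    static String mutate(String valid, Random rng) {
        char[] chars = valid.toCharArray();
        int flips = 1 + rng.nextInt(3);
        for (int i = 0; i < flips; i++) {
            chars[rng.nextInt(chars.length)] = (char) rng.nextInt(256);
        }
        return new String(chars);
    }

    public static void main(String[] args) {
        Random rng = new Random(42);  // fixed seed so runs are repeatable
        int failures = 0;
        for (int i = 0; i < 1000; i++) {
            String fuzzed = mutate("color=blue", rng);
            try {
                parseValue(fuzzed);
            } catch (RuntimeException e) {
                failures++;  // each failure exercises an error path
            }
        }
        System.out.println("unexpected-input failures: " + failures);
    }
}
```

Real fuzzing tools are far more sophisticated, tracking which inputs reach new code paths, but the core idea is this mutation loop.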
Fuzzing involves passing input that breaks the expected input “rules”. Those rules could come from formal requirements, such as an RFC, or informal requirements, such as the set of parameters accepted by an application. Fuzzing tools can use formal standards, extracted patterns and even randomly generated inputs to test an application’s resilience against unexpected or illegal input. [Read More]

How I Spent My Christmas Vacation
(or Upgrading to Android and Windows 7)
The holidays are usually a time I can use to catch up on some extra reading or research. This year, two major infrastructure changes occupied my time. I moved from my Blackberry Storm to an HTC Incredible, and from my old Gateway M680 with Windows XP to a Dell Vostro 3700 running Windows 7. It has been a bumpy couple of weeks getting my virtual life back in order.
Before getting into some of the details of the experiences, I’ll summarize by saying that both upgrades were worth the learning curve and associated frustration. The Incredible’s hardware and the Android OS are orders of magnitude beyond the Storm in terms of usability, reliability and functionality. On my computer, Windows 7 (64-bit Professional) provides a clean and efficient environment, and its compatibility with 32-bit applications has worked flawlessly so far. [Read More]

CIO, a Role for Two
Actors often enjoy the challenge of a role that requires presenting two completely different personas. Jekyll and Hyde, Peter Pan’s Captain Hook and Mr. Darling, and The Prince and the Pauper all give an actor the chance to play two different people within the same role. CIOs are cast in a role with a similar theme, one requiring two very different mindsets.
For the CIO, this duality is described in a variety of ways. Sometimes the CIO’s job requirements are discussed as internally and externally focused. In other cases people separate the responsibilities into infrastructure and business.
Regardless of how the aspects are expressed, there is an understanding that the CIO provides leadership in two different realms. One realm is focused on keeping equipment operating, minimizing maintenance costs, achieving SLAs and allowing the business to derive value from IT investments. The other realm focuses on business strategy and seeks to derive new functionality in support of improved productivity, customer service, profitability and other corporate measures.
By analogy, the first realm keeps the power flowing while the second creates new devices to plug in and do work.
One could argue that a rethinking of corporate structure might help simplify this situation. After all, we don’t charge the CFO with maintaining the infrastructure around financial systems, including file cabinets, door locks and computer hardware. Why should a person charged with exploiting computers for the benefit of the corporation also be charged with the maintenance of the computer hardware and software? Couldn’t the latter responsibility be provided by an operations group, similar to the handling of most utilities?