Our people weigh in on the issues of the day.
Blue Slate's people think a lot about the challenges facing their industries today. In the process, they often come up with completely unexpected slants on current issues, or new ways of thinking about business problems. Bluespeak is where they share those thoughts. Feel free to read and reflect.
[Any views or opinions represented in this blog are personal and belong solely to the blogger and do not represent those of Blue Slate Solutions.]
I am excited to share the news that Blue Slate Solutions has kicked off a formal innovation program, creating a lab environment which will leverage the Cognitive Corporation™ framework and apply it to a suite of processes, tools and techniques. The lab will use a broad set of enterprise technologies, applying the learning organization concepts implicit in the Cognitive Corporation’s™ feedback loop.
I’ve blogged a couple of times (see references at the end of this blog entry) about the Cognitive Corporation™. The depiction has changed slightly but the fundamentals of the framework are unchanged.
The focus is to create a learning enterprise, where the learning is built into the system integrations and interactions. Enterprises have been investing in these individual components for several years; however, they have not truly integrated them in a way that promotes learning.
By “integrating” I mean allowing the system to understand the meaning of the data being passed between them. Creating a screen in a workflow (BPM) system that presents data from a database to a user is not “integration” in my opinion. It is simply passing data around. This prevents the enterprise ecosystem (all the components) from working together and collectively learning.
I liken such connections to my taking a hand-written note in a foreign language, which I don’t understand, and typing the text into an email for someone who does understand the original language. Sure, the recipient can read it, but I, representing the workflow tool passing the information from database (note) to screen (email) in this case, have no idea what the data means and cannot possibly participate in learning from it. Integration requires understanding. Understanding requires defined and agreed-upon semantics.
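To make the idea of "defined and agreed-upon semantics" concrete, here is a minimal sketch of one common approach: mapping each system's local field names onto a shared vocabulary so the middleware can interpret, rather than merely forward, the data. The vocabulary URIs, field names and records below are invented for illustration only.

```python
# Hypothetical sketch: systems agree on a shared vocabulary so middleware
# can understand the data it passes, not just relay opaque field names.

# Agreed-upon semantics: each system-local field maps to a common concept.
SHARED_VOCABULARY = {
    "cust_dob": "http://example.com/ontology#dateOfBirth",
    "birth_date": "http://example.com/ontology#dateOfBirth",
    "claim_amt": "http://example.com/ontology#claimAmount",
}

def to_semantic(record: dict) -> dict:
    """Translate a system-local record into shared concepts that the whole
    ecosystem understands."""
    return {SHARED_VOCABULARY[field]: value
            for field, value in record.items()
            if field in SHARED_VOCABULARY}

# Two systems use different local names for the same concept; once mapped,
# the middleware can recognize that both records describe a date of birth.
db_record = {"cust_dob": "1970-01-01", "claim_amt": 1200}
bpm_record = {"birth_date": "1970-01-01"}

assert (to_semantic(db_record)["http://example.com/ontology#dateOfBirth"]
        == to_semantic(bpm_record)["http://example.com/ontology#dateOfBirth"])
```

In practice this role is often played by ontologies and standards such as RDF/OWL, but the principle is the same: meaning lives in a shared model, not in each system's private field names.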
This is just one of the Cognitive Corporation™ concepts that we will be exploring in the lab environment. We will also be looking at the value of these technologies within different horizontal and vertical domains. Given our expertise in healthcare, finance and insurance, our team is well positioned to use the lab to explore the use of learning BPM in many contexts. [Read More]

The Cognitive Corporation™ – Effective BPM Requires Data Analytics
The Cognitive Corporation™ is a framework introduced in an earlier posting. The framework is meant to outline a set of general capabilities that work together in order to support a growing and thinking organization. For this post I will drill into one of the least mature of those capabilities in terms of enterprise solution adoption – Learn.
Business rules, decision engines, BPM and complex event processing (CEP) all evoke images of computers making speedy decisions to the benefit of our businesses. The infrastructure, technologies and software that provide these solutions (SOA, XML schemas, rule engines, workflow engines, etc.) support the decision automation process. However, they don’t know what decisions to make.
The BPM-related components we acquire provide the how of decision making (send an email, route a claim, suggest an offer). Learning, supported by data analytics, provides a powerful path to the what and why of automated decisions (send this email to that person because they are at risk of defecting, route this claim to that underwriter because it looks suspicious, suggest this product to that customer because they appear to be buying these types of items).
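The division of labor between analytics (the "what and why") and the rule layer (the "how") can be sketched in a few lines. Everything here is invented for illustration: a toy data set of customer activity, a naive learned threshold, and a rule the workflow layer could execute.

```python
# Hypothetical sketch: analytics over historical data supplies the "what/why"
# (which customers are at risk of defecting), while the rule layer supplies
# the "how" (send a retention email).
from statistics import mean

# Historical observations: (monthly_logins, defected?)
history = [(2, True), (1, True), (3, True), (12, False), (9, False), (15, False)]

# "Learn": derive a simple risk threshold from the data -- midway between
# the average activity of defectors and of loyal customers.
defector_avg = mean(logins for logins, defected in history if defected)
loyal_avg = mean(logins for logins, defected in history if not defected)
risk_threshold = (defector_avg + loyal_avg) / 2  # (2 + 12) / 2 = 7

def decide(customer_logins: int) -> str:
    # The "how": a rule the BPM layer can execute, parameterized by what
    # the analytics learned rather than by a hard-coded guess.
    if customer_logins < risk_threshold:
        return "send retention email"
    return "no action"

print(decide(4))   # below the learned threshold -> send retention email
print(decide(14))  # healthy activity -> no action
```

As new outcomes accumulate in `history`, re-running the analytics yields an updated threshold, which is exactly the feedback loop described below.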
I’ll start by outlining the high-level journey from data to rules and the cyclic nature of that journey. Data leads to rules, rules beget responses, responses manifest as more data, new data leads to new rules, and so on. Therefore, the journey does not end with the definition of a set of processes and rules. This link between updated data and the determination of new processes and rules is the essence of any learning process, providing a key function for the Cognitive Corporation™. [Read More]

The Cognitive Corporation™ – An Introduction
Given my role as an enterprise architect, I’ve had the opportunity to work with many different business leaders, each focused on leveraging IT to drive improved efficiencies, lower costs, increase quality, and broaden market share throughout their businesses. The improvements might involve any subset of data, processes, business rules, infrastructure, software, hardware, etc. A common thread is that each project seeks to make the corporation smarter through the use of information technology.
As I’ve placed these separate projects into a common context of my own, I’ve concluded that the long term goal of leveraging information technology must be for it to support cognitive processes. I don’t mean that the computers will think for us, rather that IT solutions must work together to allow a business to learn, corporately.
The individual tools that we utilize each play a part. However, we tend to utilize them in a manner that focuses on isolated and directed operation rather than incorporating them into an overall learning loop. In other words, we install tools that we direct without asking them to help us find better directions to give.
Let me start with a definition: similar to thinking beings, a Cognitive Corporation™ leverages a feedback loop of information and experiences to inform future processes and rules. Fundamentally, learning is a process: it involves taking known facts and experiences and combining them to create new hypotheses, which are tested in order to derive new facts, processes and rules. Unfortunately, we don’t often leverage our enterprise applications in this way. [Read More]

Fuzzing – A Powerful Technique for Software Security Testing
Unexpected input is what proves useful when looking for untested paths through the code. If someone shows me an application for evaluation, the last thing I need to worry about is using it in an expected fashion; everyone else will do that. In fact, I default to entering data outside the specification when looking at a new application. I don’t know that my team always appreciates the approach. They’d probably like to see the application work at least once while I’m in the room.
These days there is a formal name for this type of testing: fuzzing. A few years ago I preferred calling it “gorilla testing” since I liked the mental picture of beating on the application. (Remember the American Tourister luggage ad in the 1970s?) But alas, it appears that fuzzing has become the accepted term.
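A minimal random-input fuzzer can be sketched in a few lines. The target function, names and parameters here are invented purely for illustration; real fuzzing tools are far more sophisticated about input generation and crash triage.

```python
# Illustrative random-input fuzzing sketch: hammer a function with inputs
# that ignore the expected "rules" and record any unexpected failure.
import random
import string

def parse_age(text: str) -> int:
    """Toy function under test: expects a small non-negative integer."""
    value = int(text)  # raises ValueError on non-numeric input
    if not 0 <= value <= 150:
        raise ValueError("age out of range")
    return value

def fuzz(runs: int = 1000, seed: int = 42) -> list:
    random.seed(seed)
    crashes = []
    for _ in range(runs):
        # Deliberately ignore the spec: random lengths, random characters.
        candidate = "".join(random.choice(string.printable)
                            for _ in range(random.randint(0, 12)))
        try:
            parse_age(candidate)
        except ValueError:
            pass  # rejected cleanly -- the expected failure mode
        except Exception as exc:
            crashes.append((candidate, exc))  # unexpected failure path
    return crashes

print(f"unexpected crashes: {len(fuzz())}")
```

The interesting output is not the inputs that are rejected cleanly, but the ones that trigger a failure mode the specification never anticipated.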
Fuzzing involves passing input that breaks the expected input “rules”. Those rules could come from formal requirements, such as an RFC, or informal requirements, such as the set of parameters accepted by an application. Fuzzing tools can use formal standards, extracted patterns and even randomly generated inputs to test an application’s resilience against unexpected or illegal input. [Read More]

Semantic Web Summit (East) 2010 Concludes
I attended my first semantic web conference this week, the Semantic Web Summit (East) held in Boston. The focus of the event was how businesses can leverage semantic technologies. I was interested in what people were actually doing with the technology. The one and a half days of presentations were informative and diverse.
Our host was Mills Davis, a name that I have encountered frequently during my exploration of the semantic web. He did a great job of keeping the sessions running on time as well as engaging the audience. The presentations were generally crisp and clear. In some cases the speaker presented a product that utilizes semantic concepts, describing its role in the value chain. In other cases we heard about challenges solved with semantic technologies.
My major takeaways were: 1) semantic technologies work and are being applied to a broad spectrum of problems, and 2) the potential business applications of these technologies are vast and ripe for creative minds to explore. This all bodes well for people delving into semantic technologies: an infrastructure of tools and techniques is already available to build upon, and there are broad opportunities to benefit from applying them. [Read More]

JavaOne 2010 Concludes
My last two days at JavaOne 2010 included some interesting sessions as well as spending some time in the pavilion. I’ll mention a few of the session topics that I found interesting as well as some of the products that I intend to check out.
I attended a session on creating a web architecture focused on high-performance with low-bandwidth. The speaker was tasked with designing a web-based framework for the government of Ethiopia. He discussed the challenges that are presented by that country’s infrastructure – consider network speed on the order of 5Kbps between sites. He also had to work with an IT group that, although educated and intelligent, did not have a lot of depth beyond working with an Oracle database’s features.
His solution allows developers to create fully functional web applications that keep exchanged payloads under 10K. Although I understand the logic of the approach in this case, I’m not sure the technique would be practical in situations without such severe bandwidth and skill set limitations.
A basic theme during his talk was to keep the data and logic tightly co-located. In his case it is all located in the database (PL/SQL) but he agreed that it could all be in the application tier (e.g. NoSQL). I’m not convinced that this is a good approach to creating maintainable high-volume applications. It could be that the domain of business applications and business verticals in which I often find myself differ from the use cases that are common to developers promoting the removal of tiers from the stack (whether removing the DB server or the mid-tier logic server).
One part of his approach with which I absolutely concur is pushing processing onto the client. Using the client’s CPU seems like common sense to me. The work is in balancing that with security and bandwidth. However, it can be done, and I believe we will continue to find more effective ways to leverage all that computing power. [Read More]