
Posts Tagged ‘Information Systems’

Business Rules Forum 2009 Winds Down

Friday, November 6th, 2009

With the vendors gone, the main hall seemed spacious during the morning keynote and lunchtime presentations. James Taylor [of “Smart (Enough) Systems” fame] delivered the keynote address. He always has interesting insights into the state of agile systems design, leveraging both automated decisioning and workflow processes.

James made the point that systems need to be more agile given higher levels of uncertainty with which our businesses deal.  The need to adjust and react is more critical as our business strategies and goals flex to the changing environment.  Essentially he seemed to be saying that businesses should not reduce their efforts to be agile during this economic downturn.  Rather, it is more important to increase agility in order to respond quickly to shifting opportunities.

Following the keynote I attended Brian Dickinson’s session titled “Business Event Driven Enterprises Rule!” The session description in the conference guide had caught my attention because it mentioned “event partitioning,” a phrase I had not heard applied to designing automated solutions for businesses.

I was glad that I went.  Brian was an energetic speaker!  It was clear that he was deeply committed and passionate about focusing on events rather than functionality when considering process automation.  The hour-long session flew by and it was apparent that we hadn’t made a dent in what he really wanted to communicate.

Brian was kind enough to give attendees a copy of his book, “Creating Customer Focused Organizations,” which I hope expands on the premise of his presentation today. Although events are quite natural as a design paradigm when building event-driven UIs and multi-threaded applications, I have not spent time focusing on them when designing the business and database tiers of applications. For me, the first step in working with his central tenets will be to try applying them, at least in a small way, on my next architecture project.

Business Rules Forum: In the Groove

Thursday, November 5th, 2009

The second day of the BRF is typically the most active. People are arriving throughout day 1 and start heading out on day 3. I’m attending RuleML, which follows on the heels of the BRF, so I’ll be here for all of it.

The morning keynote was delivered by Stephen Hendrick (IDC).  His presentation was titled, “BRMS at a Cross Roads: the Next Five Years.”  It was interesting hearing his vision of how BRMS vendors will need to position their offerings in order to be relevant for the future needs of businesses.

I did find myself wondering whether his vision was somewhat off in terms of timing. The move to offer unified (or at least integrated) solutions based on traditional BRMS, Complex Event Processing, Data Transformation and Analytics seemed well beyond where many clients I encounter are in leveraging their existing BRMS capabilities.

Between discussions with attendees and work on projects for which Blue Slate’s customers hire us, the current state of affairs seems to be more about understanding how to begin using a BRMS. I find many clients are just getting started with effective governance, rule harvesting and infrastructure support for BRMS integration. Discussions surrounding more complex functionality are premature for these organizations.

As usual, there were many competing sessions throughout the day that I wanted to attend.  I had to choose between these and spending some in-depth time with a few of the vendors.  One product that I really wanted to get a look at was JBoss Rules (Red Hat).

Similar to most Red Hat offerings there are free and fee-based versions of the product.  Also, as is typical between the two versions, the fee-based version is aimed at enterprises that do not want to deal with experimental or beta aspects of the product, instead preferring a more formal process of periodic production-worthy upgrades.  The fee-based offering also gets you support, beyond the user groups available to users of the free version.

The naming of the two versions is not clear to me. I believe that the fee-based version is called JBoss Rules while the free download is called JBoss Drools, owing to the fact that Red Hat used Drools as the basis for its rule engine offering. The Drools suite includes BPM, BRM and event processing components. My principal focus was the BRMS to start.

The premier open source rules offering (in my opinion) has come a long way since I last tried it over a year ago. The feature set includes a version control repository for the rules, somewhat user-friendly rule editing forms and a test harness. Work is underway to support templating for rules, which is vital for creating rules that business users can maintain easily. I will be downloading and working with this rule engine again shortly!
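For anyone who wants to experiment along with me, embedding the engine in Java looks roughly like the sketch below. This is a minimal sketch from my memory of the Drools 5.x API of this era, so treat it as a starting point rather than a definitive recipe; the discount.drl file and the Order fact are hypothetical placeholders.

```java
import org.drools.KnowledgeBase;
import org.drools.KnowledgeBaseFactory;
import org.drools.builder.KnowledgeBuilder;
import org.drools.builder.KnowledgeBuilderFactory;
import org.drools.builder.ResourceType;
import org.drools.io.ResourceFactory;
import org.drools.runtime.StatefulKnowledgeSession;

public class DroolsSketch {

    // Hypothetical fact class; rules in discount.drl would match on it.
    public static class Order {
        private final String customerTier;
        private final double total;
        public Order(String customerTier, double total) {
            this.customerTier = customerTier;
            this.total = total;
        }
        public String getCustomerTier() { return customerTier; }
        public double getTotal() { return total; }
    }

    public static void main(String[] args) {
        // Compile the rule source (DRL) found on the classpath.
        KnowledgeBuilder builder = KnowledgeBuilderFactory.newKnowledgeBuilder();
        builder.add(ResourceFactory.newClassPathResource("discount.drl"),
                    ResourceType.DRL);
        if (builder.hasErrors()) {
            throw new IllegalStateException(builder.getErrors().toString());
        }

        // Load the compiled rules into a knowledge base and run a session.
        KnowledgeBase kbase = KnowledgeBaseFactory.newKnowledgeBase();
        kbase.addKnowledgePackages(builder.getKnowledgePackages());
        StatefulKnowledgeSession session = kbase.newStatefulKnowledgeSession();
        try {
            session.insert(new Order("gold", 150.0)); // assert a fact
            session.fireAllRules();                   // let the rules react
        } finally {
            session.dispose();
        }
    }
}
```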

Business Rules Forum: Full Fledged Kickoff!

Wednesday, November 4th, 2009

Today the Business Rules Forum (BRF) kicked off for its 12th year.  Gladys Lam welcomed us and set the stage for an enlightening and engaging three days.  Jim Sinur (Gartner) gave the keynote address.  His expertise surrounding the entire field of Business Process Management (BPM), Business Rules Management (BRM) and Complex Event Processing (CEP) gives him significant insight into the industry and trends.

Jim’s talk was a call to action for product vendors and practitioners: the world has changed fundamentally, and being able to leverage what he called “weak signals,” along with myriad events from many sources, is becoming a requirement for successful business operations. As always, his talk was accompanied by a little humor and a lot of excellent supporting material.

During the day I attended three sessions and two of the vendor “Fun Labs”. For me, the most interesting session was given by Graham Witt (Ajilon). He discussed his success with an approach that lets business users document rules in a structured natural language. His basis was SBVR, but he reduced the complexity to create a practical solution.

Graham did a great job of walking us through a set of definitions for fact models, terms, fact types and so forth. Using our understanding of the basic components of a structured rule, he explored how one can take an ambiguous statement, leverage the structure inherent in the fact model, and create an unambiguous statement that is still completely understandable to the business user.

His approach of creating templates for each type of rule made sense as a very effective method to give the business user the flexibility of expressing different types of rules while staying within a structured syntax.  This certainly seems like an approach to be explored for getting us closer to a DRY (Don’t Repeat Yourself) process that moves rules from the requirements into the design and implementation phases of a rules-based project.
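As a toy illustration of the template idea (the wording and slot names here are my own invention, not Graham’s actual syntax): each rule type gets a fixed sentence skeleton, and the author only supplies terms drawn from the fact model.

```java
import java.util.Map;

public class RuleTemplates {
    // One skeleton per rule type; the slot names are illustrative only.
    static final String CONSTRAINT_RULE =
        "Each {subject} must have {quantifier} {object}.";

    static String fill(String template, Map<String, String> slots) {
        String rule = template;
        for (Map.Entry<String, String> slot : slots.entrySet()) {
            rule = rule.replace("{" + slot.getKey() + "}", slot.getValue());
        }
        return rule;
    }

    public static void main(String[] args) {
        // Terms come from the fact model, keeping the statement unambiguous.
        System.out.println(fill(CONSTRAINT_RULE, Map.of(
            "subject", "order",
            "quantifier", "exactly one",
            "object", "billing address")));
        // Prints: Each order must have exactly one billing address.
    }
}
```

Because the skeleton is fixed, every rule of a given type reads the same way, which is exactly what makes the statements both business-friendly and unambiguous.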

The vendor labs were also interesting. I attended one run by Innovations Software Technology and another by RuleArts.

Business Rules Forum Tutorials: Analytics and Events

Tuesday, November 3rd, 2009

This was the second of two pre-conference days offering a set of interesting tutorial sessions.  Although the choices were tough, I decided on Eric Siegel’s and David Luckham’s sessions.  Both were thought provoking.

Eric’s session, “Driving Decisions with Predictive Analytics: The Top Seven Business Applications,” caught my attention due to its focus on analytics. I have taken two data analysis courses as part of the Master’s program at Union Graduate College. The courses, “Systems Modeling and Optimization” and “Data Mining,” really piqued my interest in this field.

What was different about Eric’s presentation was its focus on real-world use of these techniques.  Understandably, he could not delve into the detail of a semester-long course.  He did a great job of introducing the basic concepts of data mining and then explored how these can be leveraged to build models that can then be used to drive business decisions.

Beyond explaining the basics of creating models (formatting data, choosing algorithms, training, testing), he discussed how the resulting model isn’t a magic bullet that will generate business rules. Rather, the model provides the ability to make decisions, but those decisions must be created by the business.
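A trivial sketch makes that division of labor concrete. Everything here is invented for illustration (the features, the weights, the hold-out records); the point is only that the model produces a score, while choosing where to act on that score is the business decision.

```java
public class ChurnScoringSketch {
    // Toy scoring "model" over two made-up features. A real model's weights
    // would come from training, not from thin air as they do here.
    static double score(double monthsInactive, double supportCalls) {
        return 0.1 * monthsInactive + 0.05 * supportCalls;
    }

    public static void main(String[] args) {
        // Hypothetical hold-out records: {monthsInactive, supportCalls, churned}
        double[][] holdout = { {8, 4, 1}, {1, 0, 0}, {6, 1, 1}, {2, 3, 0} };

        // The threshold is the business's call: at what score do we act?
        double threshold = 0.5;
        int correct = 0;
        for (double[] row : holdout) {
            boolean predicted = score(row[0], row[1]) >= threshold;
            boolean actual = row[2] == 1.0;
            if (predicted == actual) correct++;
        }
        System.out.printf("Hold-out accuracy at threshold %.2f: %d of %d%n",
                          threshold, correct, holdout.length);
    }
}
```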

I believe that leveraging predictive analytics will continue to grow as a key differentiator for businesses and a key feature leveraged in business rule engines.  Having a keen interest in this area, I look forward to assisting businesses derive value from the growing set of analytical tools and techniques.

My afternoon session choice, delivered by David Luckham, was titled, “Complex Event Processing in An Event-Driven, Information World.”  Complex Event Processing (CEP) is not an area with which I am familiar and David’s presentation covered a broad cross-section of the field.

Professor Luckham, now emeritus at Stanford, has an amazing amount of knowledge regarding CEP. He discussed its market, history, technology and his predictions for its future. He flew through several presentations that make up a CEP course he teaches. Given the amount of material he has on the topic, he let us help tune his presentation based on our particular interests.

It is clear that he has a passion around CEP and a strong belief that it will grow into a core, hence transparent, feature of all service-based networks. He refers to this end state as “Holistic Event Processing” (HEP).

The power of the platform he describes would be amazing.  Although he did not compare the vision to Mashups and environments such as Yahoo Pipes, the power of HEP would seem to extend and go well beyond the operation of those tools.
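To ground the idea for myself, I put together the toy sketch below (my own illustration, nothing from David’s materials): low-level events arrive on a stream, and a composite event is derived when a pattern shows up within a time window.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class CepWindowSketch {
    static class Event {
        final String type;
        final long timestampMillis;
        Event(String type, long timestampMillis) {
            this.type = type;
            this.timestampMillis = timestampMillis;
        }
    }

    private final Deque<Event> window = new ArrayDeque<>();

    // Composite event: three or more "login-failed" events within 60 seconds.
    boolean onEvent(Event e) {
        window.addLast(e);
        // Evict events that have aged out of the 60-second window.
        while (e.timestampMillis - window.peekFirst().timestampMillis > 60_000) {
            window.removeFirst();
        }
        long failures = window.stream()
                              .filter(ev -> ev.type.equals("login-failed"))
                              .count();
        return failures >= 3; // the derived "possible break-in" event
    }

    public static void main(String[] args) {
        CepWindowSketch cep = new CepWindowSketch();
        String[] stream = {"login-failed", "page-view", "login-failed", "login-failed"};
        long t = 0;
        for (String type : stream) {
            t += 10_000; // one event every 10 seconds
            if (cep.onEvent(new Event(type, t))) {
                System.out.println("Composite event detected at t=" + t / 1000 + "s");
            }
        }
    }
}
```

A real CEP engine adds a declarative pattern language, many concurrent streams, and distribution, but the windowing intuition is the same.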

It will be interesting to see how this field and the products being created become part of our standard enterprise infrastructure.  There is a long way to go before we reach David’s vision.

Tomorrow the Business Rules Forum launches in earnest with lots of presentations and vendor demonstrations.  I’m looking forward to a variety of interesting discussions as the week goes on.


Net Neutrality: Is There a Reason for Concern?

Monday, October 12th, 2009

Lately the subject of net neutrality has garnered a lot of attention. As businesses large and small create an ever-increasing set of offerings that require lots of bandwidth, there is concern that the Internet infrastructure may not be able to keep data flowing smoothly.

The core of the Internet’s bandwidth is provided by a few businesses.  These businesses exist to make money.  Fundamentally, when demand exceeds supply the cost of the good or service goes up.  In this case those costs might appear as increased charges for access or a slowing of one company’s data transfer versus another.

As in many debates, there are two extreme positions represented by individuals, companies and trade groups. In this case the question being debated is whether there is a need to legislate a message-neutral Internet (Net Neutrality).

The meaning of being “neutral” is that all data flowing across the Internet would be given equal priority.  The data being accessed by a doctor reading a CAT scan from a health records system would receive the same priority as someone watching a YouTube video.

Although the debate surrounds whether net neutrality should be a requirement, the reasons for taking a position vary. I’ll start with the concerns shared by those who want a neutral net to be guaranteed.

Why Net Neutrality is Important

The Internet has served as a large and level playing field.  With a very small investment, companies can create a web presence that allows them to compete as peers of companies many times their size.

Unlike the brick-and-mortar world where the location, size, inventory, staff and ambiance of a store have direct monetary requirements, a web site is limited only by the creativity and effort of a small team with an idea. If Amazon and Google had needed to create an infrastructure on par with Waldenbooks or IBM in order to get started, they would have had a much tougher journey.

Data on the Internet should be equally accessible. It should not matter who my Internet Service Provider (ISP) is. Nor should my ISP’s commercial relationships have a bearing on the service it provides me to access the services of my choice.

If I choose to use services from my ISP’s competitor, I should have access equivalent to using my ISP’s offering. For instance, if I choose to use Google’s portal instead of my ISP’s portal, the data sent by Google must not be impeded in favor of customers requesting my ISP’s content.

Network discrimination would dismantle the fundamental design of the Internet.  One of the original design goals for the Internet was its ability to get data from one place to another without regard for the actual content.  In other words, the underlying transport protocol knows nothing about web pages, SSH sessions, videos, and Flash applications.  It is this service agnosticism that has allowed myriad services to be created without having to fundamentally reengineer the Internet backbone.

An infrastructure that once routinely carried only telnet and UUCP sessions now passes all manner of protocols and data. Adding a layer of discrimination would require altering this higher-level agnosticism, since it is typically through inspection of the payload that one would arrive at decisions regarding varying the level of service. This would lead the Internet down a road away from its open, standards-based design.
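A toy contrast (my own illustration, nothing like real router code, which works on IP headers) shows the difference: neutral forwarding can treat the payload as opaque bytes, while discrimination forces the infrastructure to parse application content it was never meant to understand.

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayDeque;
import java.util.Queue;

public class ForwardingSketch {
    static final Queue<byte[]> line = new ArrayDeque<>();      // neutral: one line
    static final Queue<byte[]> fastLane = new ArrayDeque<>();  // discriminating
    static final Queue<byte[]> slowLane = new ArrayDeque<>();

    // Neutral forwarding: first come, first served; payload never examined.
    static void forwardNeutral(byte[] payload) {
        line.add(payload);
    }

    // Discriminating forwarding: must peek inside the payload to classify it.
    static void forwardDiscriminating(byte[] payload) {
        String guess = new String(payload, StandardCharsets.US_ASCII);
        if (guess.startsWith("PARTNER-VIDEO")) {  // made-up content marker
            fastLane.add(payload);
        } else {
            slowLane.add(payload);
        }
    }

    public static void main(String[] args) {
        byte[] video = "PARTNER-VIDEO frame-data".getBytes(StandardCharsets.US_ASCII);
        byte[] scan  = "CAT-SCAN slice-data".getBytes(StandardCharsets.US_ASCII);
        forwardNeutral(video);
        forwardNeutral(scan);
        forwardDiscriminating(video);
        forwardDiscriminating(scan);
        System.out.println("Neutral line: " + line.size()
                + ", fast lane: " + fastLane.size()
                + ", slow lane: " + slowLane.size());
    }
}
```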

There are other specific arguments made in favor of ensuring net neutrality. In my opinion most of them relate to one of those I’ve mentioned. So why oppose the requirement of net neutrality?

Why Do So Many Information Systems Implementations Fail and What Can Be Done to Improve Our Success Rate?

Thursday, May 7th, 2009

Information Systems (IS) implementations normally fail due to a lack of ownership, planning and execution by the organization.  The software and hardware tend to do what they are supposed to do.  Their features and limitations are typically known, at least if we take the time to investigate.  However, it is the organizational issues among and between business units and teams that actually require most of the effort when running an IS project.

The root causes of IS project failures include weak scoping, lack of executive management ownership, poor project management, lack of scope controls, improper infrastructure and inappropriate technology choices.

Weak scoping leads to a project whose requirements are too broad to be met by any single system. Similarly, the team will be too broad, with differing opinions as to the ultimate purpose of the project and, therefore, the application. After all, if the team members each interpret the goals of the project in different ways, it will be difficult, and time consuming, to arrive at consensus on each aspect of the project.

Lack of executive management ownership leaves the project team without an effective sponsor.  Having such a sponsor helps mitigate the various issues that will arise as the project team seeks to design and implement a new system.  Maintaining focus on the business goals of the system along with championing the system and breaking down barriers between groups are major functions for the executive owner.

Project management is key to delivering on any sort of solution, technology or otherwise.  Knowing the team roles, responsibilities, timelines and dependencies allows for issues to be identified proactively and resolved in a timely manner.  Exit strategies must be defined that rely on understanding the current project risks.  Without effective project management the actual status of the project remains hidden.
