
Posts Tagged ‘enterprise systems’

Database Refactoring and RDF Triples

Wednesday, May 12th, 2010

One of the aspects of agile software development that may lead to significant angst is the database.  Unlike refactoring code, the refactoring of the database schema involves a key constraint – state!  A developer may rearrange code to his or her heart’s content with little worry since the program will start with a blank slate when execution begins.  However, the database “remembers.”  If one accepts that each iteration of an agile process produces a production release, then the stored data can’t be deleted as part of the next iteration.

The refactoring of a database becomes less and less trivial as project development continues.  While developers have IDEs to refactor code, change packages, and alter build targets, there are few tools for refactoring databases.

My definition of a database refactoring tool is one that assists the database developer by remembering the database transformation steps and storing them as part of the project – e.g. part of the build process.  This includes both the schema changes and data transformations.  Remember that the entire team will need to reproduce these steps on local copies of the database.  It must be as easy to incorporate a peer’s database schema changes, without losing data, as it is to incorporate the code changes.

These same data-centric complexities exist in waterfall approaches when going from one version to the next.  Whenever the database structure needs to change, a path to migrate the data has to be defined.  That transformation definition must become part of the project’s artifacts so that the data migration for the new version is supported as the program moves between environments (test, QA, load test, integrated test, and production).  Also, the database transformation steps must be automated and reversible!

That last point, the ability to rollback, is a key part of any rollout plan.  We must be able to back out changes.  It may be that the approach to a rollback is to create a full database backup before implementing the update, but that assumption must be documented and vetted (e.g. the approach of a full backup to support the rollback strategy may not be reasonable in all cases).
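To make this concrete, here is a rough sketch of what a scripted, reversible migration step might look like.  I’m using Python and SQLite purely for illustration – no specific tool or database is implied above, and the table and column names are made up – but the point stands: the schema change, the data transformation and the rollback path all live in the project as executable, repeatable artifacts.

    import sqlite3

    def upgrade(conn: sqlite3.Connection) -> None:
        # Schema change: split the single 'name' column into two new columns.
        conn.executescript("""
            ALTER TABLE customer ADD COLUMN first_name TEXT;
            ALTER TABLE customer ADD COLUMN last_name TEXT;
        """)
        # Data transformation: populate the new columns from the existing rows.
        rows = conn.execute("SELECT rowid, name FROM customer").fetchall()
        for rowid, name in rows:
            first, _, last = (name or "").partition(" ")
            conn.execute(
                "UPDATE customer SET first_name = ?, last_name = ? WHERE rowid = ?",
                (first, last, rowid))
        # DROP COLUMN needs a newer SQLite; older versions would rebuild the table.
        conn.execute("ALTER TABLE customer DROP COLUMN name")
        conn.commit()

    def downgrade(conn: sqlite3.Connection) -> None:
        # The reverse path: restore 'name' so the release can be backed out
        # without resorting to a full database restore.
        conn.execute("ALTER TABLE customer ADD COLUMN name TEXT")
        conn.execute("UPDATE customer SET name = first_name || ' ' || last_name")
        conn.executescript("""
            ALTER TABLE customer DROP COLUMN first_name;
            ALTER TABLE customer DROP COLUMN last_name;
        """)
        conn.commit()

Each step like this would be numbered and checked in alongside the code, so that every member of the team (and every environment) applies the same transformations in the same order.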

This database refactoring issue becomes very tricky when dealing with multiple versions of an application.  The transformation of the database schema and data must be done in a defined order.  As more and more data is stored, the process consumes more storage and processing resources.  This is the ETL side-effect of any system upgrade.  Its impact is simply felt more often (e.g. potentially during each iteration) in an agile project.

As part of exploring semantic technology, I am interested in contrasting this to a database that consists of RDF triples.  The semantic relationships of data do not change as often (if at all) as the relational constructs.  Many times we refactor a relational database as we discover concepts that require one-to-many or many-to-many relationships.

Is an RDF triple-based database easier to refactor than a relational database?  Is there something about the use of RDF triples that reduces the likelihood of a multiplicity change leading to a structural change in the data?  If so, using RDF as the data format could be a technique that simplifies the development of applications.  For now, let’s take a high-level look at a refactoring use case.
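As a small illustration of the multiplicity question, consider the classic change where a customer who had one phone number now needs several.  In a relational schema that typically means a new table and a data migration; with triples it is simply another statement about the same subject.  The sketch below uses Python with the rdflib library and made-up URIs – my choice for illustration, not anything prescribed by the discussion above.

    from rdflib import Graph, Literal, Namespace, RDF

    EX = Namespace("http://example.com/ns#")   # hypothetical vocabulary
    g = Graph()
    customer = EX.customer42

    # Original facts: a customer with a single phone number.
    g.add((customer, RDF.type, EX.Customer))
    g.add((customer, EX.phoneNumber, Literal("518-555-0100")))

    # Requirement change: the customer now has a second number.
    # No new table, no schema migration - just one more triple.
    g.add((customer, EX.phoneNumber, Literal("518-555-0199")))

    for number in g.objects(customer, EX.phoneNumber):
        print(number)

Whether that simplicity holds up for more invasive changes – renaming predicates, reshaping class hierarchies, migrating literal formats – is exactly the question that refactoring use case is meant to explore.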

(more…)

Business Ontologies and Semantic Technologies Class

Sunday, May 9th, 2010

Last week I had the pleasure of attending Semantic Arts’ training class entitled, “Designing and Building Business Ontologies.”  The course, led by Dave McComb and Simon Robe, provided an excellent introduction to semantic technologies and tools as well as coverage of ontological best practices.  I thoroughly enjoyed the 4-day class and achieved my principal goals in attending; namely, to understand the semantic web landscape, including technologies such as RDF, RDFS, OWL, and SPARQL, as well as the current state of tools and products in this space.

Both Dave and Simon have a deep understanding of this subject area.  They also work with clients using this technology so they bring real-world examples of where the technology shines and where it has limitations.  I recommend this class to anyone who is seeking to reach a baseline understanding of semantic technologies and ontology strategies.

Why am I so interested in semantic web technology?  I am convinced that structuring information so that systems can consume it, in ways more automated than current data storage and association techniques allow, is required in order to achieve any meaningful advancement in the field of information technology (IT).  Whether wiring together web services or setting up ETL jobs to create data marts, too much IT energy is wasted on repeatedly integrating data sources – essentially hand-wiring related information because the computer cannot wire it together autonomously.

(more…)

Full Disk Encryption – A (Close to Home) Case Study

Wednesday, April 28th, 2010

This is a follow-up to my previous entry regarding full disk encryption (see: http://monead.com/blog/?p=319).  In this entry I’ll look at Blue Slate’s experience with rolling out full disk encryption company-wide.

Blue Slate began experimenting with full disk encryption in 2008.  I was actually the first user at our company to have a completely encrypted disk.  My biggest surprise was the lack of noticeable impact on system performance.  My machine (a Gateway M680) was running Windows XP with 2GB of RAM and a similarly-sized swap space.  Beyond a lot of programming work, I do video and audio editing, and I did not notice a significant impact when editing and rendering such projects.

Later in 2008, we launched a proof of concept (POC) project involving team members from across the company (technical and non-technical users).  This test group utilized laptops with fully encrypted drives for several months.  We wanted to ensure that we would not have problems with the various software packages that we use.  During this time we went through XP service pack releases, major software version upgrades and even a switch of our antivirus solution.  We had no reports of encryption-related issues from any of the participants.

By 2009 we were focused on leveraging full disk encryption on every non-server computer in the company.  It took some time due to two constraints.

First, we needed to roll out a company-wide backup solution (as mentioned in my previous post on full disk encryption, recovery of files from a corrupted encrypted device is nearly impossible).  Second, we needed to work through a variety of scheduling conflicts (we needed physical access to each machine to set up the encryption product) across our decentralized workforce.

(more…)

Full Disk Encryption – Two Out of Three Aren’t Bad

Wednesday, April 14th, 2010

Security is a core interest of mine.  I have written and taught about security for many years, consistently keeping our team focused on secure solutions, and am pursuing the CISSP certification.  Some aspects of security are hard to make work effectively and other aspects are fairly simple, having more to do with common sense than technical expertise.

In this latter category I would put full disk encryption.  Clearly there are still many companies and individuals who have not embraced this technique.  The barrage of news articles describing lost and stolen computers containing sensitive information on unencrypted hard drives makes this point every day.

This leads me to the question of why people don’t use this technology.  Is it a lack of information, limitations in the available products or something else?  For my part, I’ll focus this posting on providing information regarding full disk encryption, based on experience.  A future post will describe Blue Slate’s deployment of full disk encryption.

Security focuses on three major concepts: Confidentiality, Integrity and Availability (CIA).  These terms apply across the spectrum of potential security-related issues.  Whether considering the physical environment, hardware, applications or data, there are techniques to protect the CIA within that domain.

(more…)

Business Rules Forum 2009 Winds Down

Friday, November 6th, 2009

With the vendors gone, the main hall seemed spacious during the morning keynote and lunchtime presentations.  James Taylor [of "Smart (Enough) Systems" fame] delivered the keynote address.  He always has interesting insights regarding the state of affairs for agile systems design, leveraging both automated decisioning and workflow processes.

James made the point that systems need to be more agile given the higher levels of uncertainty with which our businesses deal.  The need to adjust and react is more critical as our business strategies and goals flex to the changing environment.  Essentially he seemed to be saying that businesses should not reduce their efforts to be agile during this economic downturn.  Rather, it is more important to increase agility in order to respond quickly to shifting opportunities.

Following the keynote I attended Brian Dickinson’s session titled, “Business Event Driven Enterprises Rule!”  The description of the session in the conference guide had caught my attention since it mentioned “event partitioning” which was a phrase I had not heard used in terms of designing automated solutions for businesses.

I was glad that I went.  Brian was an energetic speaker!  It was clear that he was deeply committed and passionate about focusing on events rather than functionality when considering process automation.  The hour-long session flew by and it was apparent that we hadn’t made a dent in what he really wanted to communicate.

Brian was kind enough to give attendees a copy of his book, “Creating Customer Focused Organizations,” which I hope expands on the premise of his presentation today.  Although quite natural as a design paradigm when building event-driven UIs and multi-threaded applications, I have not spent time focused on events when designing the business and database tiers of applications.  For me, the first step in working with his central tenets will be to try applying them, at least in a small way, on my next architecture project. (more…)

Business Rules Forum Tutorials: Analytics and Events

Tuesday, November 3rd, 2009

This was the second of two pre-conference days offering a set of interesting tutorial sessions.  Although the choices were tough, I decided on Eric Siegel’s and David Luckham’s sessions.  Both were thought provoking.

Eric’s session, “Driving Decisions with Predictive Analytics: The Top Seven Business Applications,” caught my attention due to its focus on analytics.  I have taken two data analysis courses as part of the Master’s program at Union Graduate College.  The courses, “Systems Modeling and Optimization” and “Data Mining,” really piqued my interest in this field.

What was different about Eric’s presentation was its focus on real-world use of these techniques.  Understandably, he could not delve into the detail of a semester-long course.  He did a great job of introducing the basic concepts of data mining and then explored how these can be leveraged to build models that can then be used to drive business decisions.

Beyond explaining the basics around creating models (formatting data, algorithm choices, training, testing), he discussed how the resulting model isn’t a magic bullet that will generate business rules.  Rather, from the model comes the ability to make decisions, but those decisions must be created by the business.
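Here is a toy sketch of that separation, using Python and scikit-learn with fabricated data – none of this comes from Eric’s presentation, it is simply how I picture the hand-off.  The model produces a probability; the business supplies the rule that turns the probability into an action.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Fabricated training data: two behavioral features per customer and
    # whether the customer eventually churned (1) or not (0).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

    # Training and testing, as described above.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)
    print("holdout accuracy:", model.score(X_test, y_test))

    # The model only yields a probability; the decision rule belongs to the business.
    OFFER_THRESHOLD = 0.7   # a business-chosen cutoff, not a model output
    churn_probability = model.predict_proba(X_test)[:, 1]
    for p in churn_probability[:5]:
        action = "send retention offer" if p > OFFER_THRESHOLD else "no action"
        print(f"churn probability {p:.2f} -> {action}")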

I believe that leveraging predictive analytics will continue to grow as a key differentiator for businesses and a key feature leveraged in business rule engines.  Having a keen interest in this area, I look forward to helping businesses derive value from the growing set of analytical tools and techniques.

My afternoon session choice, delivered by David Luckham, was titled, “Complex Event Processing in An Event-Driven, Information World.”  Complex Event Processing (CEP) is not an area with which I am familiar and David’s presentation covered a broad cross-section of the field.

Professor Luckham, an emeritus professor at Stanford, has an amazing amount of knowledge regarding CEP.  He discussed its market, history, technology and his predictions for its future.  He flew through several presentations that make up a CEP course he teaches.  Given the amount of material he has on the topic, he allowed us to help tune his presentation based on our particular interests.

It is clear that he has a passion for CEP and a strong belief that it will grow into a core, hence transparent, feature of all service-based networks.  He refers to this end state as “Holistic Event Processing” (HEP).

The power of the platform he describes would be amazing.  Although he did not compare the vision to Mashups and environments such as Yahoo Pipes, the power of HEP would seem to extend and go well beyond the operation of those tools.
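To ground the idea for readers who, like me, are new to CEP, here is a minimal sketch – entirely my own construction, not drawn from David’s material – of the kind of pattern a CEP engine evaluates continuously: several simple events within a time window being promoted into one higher-level, “complex” event.  It is plain Python with hypothetical event names.

    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class Event:
        kind: str         # e.g. "login_failed"
        user: str
        timestamp: float  # seconds

    class FailedLoginDetector:
        """Emits a complex 'possible_intrusion' event when one user produces
        three failed logins within a 60-second window."""

        def __init__(self, window_seconds: float = 60.0, threshold: int = 3):
            self.window = window_seconds
            self.threshold = threshold
            self.recent: dict[str, deque] = {}

        def process(self, event: Event):
            if event.kind != "login_failed":
                return None
            history = self.recent.setdefault(event.user, deque())
            history.append(event.timestamp)
            # Discard simple events that have aged out of the window.
            while history and event.timestamp - history[0] > self.window:
                history.popleft()
            if len(history) >= self.threshold:
                return Event("possible_intrusion", event.user, event.timestamp)
            return None

    detector = FailedLoginDetector()
    for e in [Event("login_failed", "alice", t) for t in (0.0, 20.0, 40.0)]:
        complex_event = detector.process(e)
        if complex_event:
            print("complex event:", complex_event)

A real CEP engine does this continuously, across many patterns and many streams, which hints at why David sees it eventually becoming a core piece of network infrastructure.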

It will be interesting to see how this field and the products being created become part of our standard enterprise infrastructure.  There is a long way to go before we reach David’s vision.

Tomorrow the Business Rules Forum launches in earnest with lots of presentations and vendor demonstrations.  I’m looking forward to a variety of interesting discussions as the week goes on.


Anticipating the 2009 Business Rules Forum

Tuesday, September 22nd, 2009

The annual Business Rules Forum is right around the corner… starting on November 1.  For the third year Blue Slate has been invited to share our insights with the attendees.  I will have an opportunity to speak about the importance of viewing all data through the lens of a company’s business rules.  The title of my talk is “Business Rules in the Integration Tier: The System of Record.”  It is scheduled for Wednesday, November 4 at 2pm (moved from 3:05pm).

I am excited and honored to be given another opportunity to speak at the preeminent conference for business rules.  Beyond sharing my thoughts I am looking forward to learning from the many practitioners that will be discussing their insights as well.  The variety of experts, topics and industries creates a valuable opportunity for anyone looking to begin or expand the use of rule-based approaches within his or her business.

In addition to the sessions, I highly recommend attending one or more of the “Fun Labs”.  They provide an opportunity to use the vendors’ products and get your questions answered.  The chance to explore these tools and see the entire process of creating, editing and running rules is powerful.

Read on for details about how this conference provides many great opportunities for learning about the techniques, tools and products that support effective application of rule-centric approaches.

About the Business Rules Forum Conference

**The only conference world-wide with all the vendors under one roof at one time!**

** Special 10% Conference Discount Courtesy of Blue Slate Solutions **

Use code “9SPDR” when you register

See details below

Have a look at this year’s program. Find out what the excitement is all about!

Download a copy of our new Conference Brochure featuring highlights of this year’s unparalleled event.

(more…)

Why Do So Many Information Systems Implementations Fail and What Can Be Done to Improve Our Success Rate?

Thursday, May 7th, 2009

Information Systems (IS) implementations normally fail due to a lack of ownership, planning and execution by the organization.  The software and hardware tend to do what they are supposed to do.  Their features and limitations are typically known, at least if we take the time to investigate.  However, it is the organizational issues among and between business units and teams that actually require most of the effort when running an IS project.

The root causes of IS project failures include weak scoping, lack of executive management ownership, poor project management, lack of scope controls, improper infrastructure and inappropriate technology choices.

Weak scope leads to a project whose requirements are too broad to be met by any single system.  Similarly, the team will be too broad, with differing opinions as to the ultimate purpose of the project and, therefore, of the application.  After all, if the team members each interpret the goal(s) of the project in different ways, it will be difficult, and time-consuming, to arrive at consensus on each aspect of the project.

Lack of executive management ownership leaves the project team without an effective sponsor.  Having such a sponsor helps mitigate the various issues that will arise as the project team seeks to design and implement a new system.  Maintaining focus on the business goals of the system along with championing the system and breaking down barriers between groups are major functions for the executive owner.

Project management is key to delivering on any sort of solution, technology or otherwise.  Knowing the team roles, responsibilities, timelines and dependencies allows for issues to be identified proactively and resolved in a timely manner.  Exit strategies must be defined that rely on understanding the current project risks.  Without effective project management the actual status of the project remains hidden.

(more…)