Wednesday, April 27, 2016

EA Week 15 Final Thoughts

As always, hello! Welcome back, if you are returning! Thanks for reading, if this is your first time!

This posting is my final blog of the semester, but I have to say I have really enjoyed it! I want to thank anyone that has commented.  I really value the thoughts, ideas, and feedback that I have received.

As a final topic, I want to discuss the execution of metrics and tracking performance.  I had several comments around metrics, the value of performance tracking, and overcommitment.  One classmate noted that one approach could use a "value-to-effort assessment for each metric. How will the collection of each metric increase workload?" You can find that discussion here.




He was quite right. If we don't understand how deploying these new metrics will impact data collection or the other efforts associated with IT and business operations, we could create a problem before getting started.  I feel that some of this would get fleshed out during the project complexity assessment.  You can find more detail on project complexity here:

Yet, I still wanted to dive deeper on this. One way to understand the effort behind a metric is to translate it into code and see where it falls apart.

For simplicity, let's agree that a metric can be defined and its conceptual, logical representation reflected as follows:
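Here is one possible rendering of that representation, sketched in Python (my own illustration, not the diagram itself; the field names and the effort thresholds are assumptions):

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Metric:
    """Conceptual definition of a single performance metric."""
    name: str
    goal: str                         # business goal the metric supports
    question: str                     # question the metric answers (GQM style)
    data_sources: List[str]           # systems that must supply the raw data
    compute: Callable[[list], float]  # how raw observations become a value

    def collection_effort(self) -> str:
        """Rough value-to-effort check: more source systems means more work."""
        n = len(self.data_sources)
        return "high" if n > 2 else "medium" if n == 2 else "low"

# Example: average change-request turnaround, pulled from two systems
cr_turnaround = Metric(
    name="CR turnaround (days)",
    goal="Improve responsiveness to business change",
    question="How long does a change request take to close?",
    data_sources=["ticketing system", "project tracker"],
    compute=lambda days: sum(days) / len(days),
)

print(cr_turnaround.collection_effort())  # -> medium
print(cr_turnaround.compute([3, 5, 10]))  # -> 6.0
```

Even at this level of detail, the `data_sources` list starts to expose the workload question my classmate raised: every source system is a touchpoint that someone has to build and maintain.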




To better highlight this concept, take a look at this example:


This image highlights several items to consider. It takes both business and IT to produce a metric, let alone EA performance metrics. A metric definition can be translated into code, at least pseudocode, very quickly in order to evaluate complexity.


As always, I welcome your comments and thoughts. It has been great working with everyone this semester and I especially thank all who have taken the time to read and comment!


Saturday, April 23, 2016

EA 873 Week 14 The Final Countdown

As always, hello! Welcome back, if you are returning! Thanks for reading, if this is your first time!

Over the course of the semester, I have been discussing different approaches to what amounts to artifact creation. For the last two postings, I thought I would focus on the next step in the cycle and discuss solution delivery. In week 3 I briefly touched on some of these items, but it is worth highlighting them again for this discussion.


In my business, we found we needed to build a strong team, establish guiding principles, and adopt a shared environment (people, process, and tools) when deploying new solutions for our clients. This need became more critical than what they were currently doing or how they wanted to evolve their processes and supporting solutions. Once a delivery framework is established, these principles and methods can be applied to individual business unit initiatives and projects across the enterprise. The role of EA is to facilitate the movement from current to future state. That change is managed by adopting a common delivery framework and supporting toolkit. This adoption, in turn, tends to reduce complexity and provides a means to deal with the diversity inherent in most business units. This framework should provide the ability to respond to changing business needs rapidly and efficiently.

In essence, the delivery process should facilitate the ability to:


  • Understand the objective and key initiatives
  • Align those key initiatives and objectives
  • Execute activities
  • Evaluate output for performance and ongoing improvement


Given this, let's take the rollout of a CRM system as an example.
The following picture depicts this rollout in a swimlane diagram. As you can see, the swimlanes show participation from functional groups; the last two lanes are the continuous improvement/refinement steps.



Looking at the diagram, I hope it becomes clear why the artifacts discussed in my previous posts are so critical to the actual delivery process; look at the touchpoints across the organization.  Even at this high level, one can begin to see the challenges if, for instance, we did not have our SWOT storyboard to refer the group back to during this process.

WEEK 11

What about in the improvement/refinement steps? What if we lacked the GQM output I discussed last week?

WEEK 13


The Gartner article "Advancing the Common Requirements Vision Deliverable" (retrieved April 23, 2016, from
https://online.ist.psu.edu/sites/ea872fusco/files/advancing_the_common_requirements.pdf)

directly speaks of the need for an anchor model and using business requirements to drive common solutions. It also talks about identifying core value-adding process improvements, which ties to the improvement/refinement step noted in the swimlane.

As I have mentioned the last couple of weeks, I hope this starts to tie my past posts together into a more useful, bigger picture.

As always your comments are welcomed! I look forward to your thoughts on how you have seen project and solution delivery occur in your career.

Monday, April 18, 2016

EA 873 Week 13 Tracking and Measuring Performance


As always, hello! Welcome back, if you are returning! Thanks for reading, if this is your first time! I am a little late getting my post done this week. Life got in the way, but I guess that is what this is all about...balancing the two!


I was going over some readings this week and I was particularly intrigued by the Gartner presentation on Developing High-Impact EA Performance Metrics. It can be viewed here:

 http://www.gartner.com/document/1413013

The presentation touched on why we lose our way when it comes to strategy execution.  I have seen this in many forms, and if there is one theme across my posts this semester, it has been that execution is crucial but only achievable with consensus and direction.  The article goes on to say that EA efforts should be guided by this strategic direction, becoming, essentially, the bridge between strategy and execution.  To act as this bridge, we must measure EA performance through metrics, giving EA and management the ability to track the overall performance of EA efforts.

I could not agree more.  Yet in my experience, this breaks down because most organizations have difficulty defining, building, and managing core metrics.  I talk about the process a little in my week five post here:

Week 5 Blog Post


I touch on the metric discussion briefly near the end and on utilizing Goal, Question, Metric or GQM.

To restate:

Per Basili et al. in The Goal Question Metric Approach, GQM is

“A technique that is based on the assumption that for an organization to measure in a purposeful way it must first specify the goals for itself and its projects, then it must trace those goals to the data that are intended to define those goals operationally, and finally provide a framework for interpreting the data with respect to the stated goals.”  

http://www.cs.umd.edu/~mvz/handouts/gqm.pdf

In other words, GQM is a hierarchical structure starting with a goal, which specifies the purpose of measurement, the object to be measured, the issue to be measured, and the viewpoint from which the measure is taken.

There are three levels to GQM.

Level 1 Conceptual Level (GOAL): Defined for an object from various points of view relative to a particular environment. Objects include:


  • Products: Artifacts, deliverables, and documents that are produced during the system life cycle; e.g. models, components, test suites.
  • Processes: Software related activities commonly associated with time; e.g. modeling, designing, testing.
  • Resources: Items used by the processes to produce their outputs; e.g. personnel, hardware, software, office space.


Level 2 Operational Level (QUESTION): A set of questions used to characterize the way the assessment/achievement of a specific goal is going to be performed, based on some characterizing model.
These questions try to describe the object of measurement (product, process, resource) with respect to a selected quality issue and to determine its quality from a selected viewpoint.

Level 3 Quantitative Level (METRIC): Data associated with every question to answer it in a quantitative way. The data can be:


  • Objective:  If they depend only on the object being measured and not on the viewpoint from which it is taken; e.g., number of versions of a document, staff hours spent on a task, size of a component.
  • Subjective:  If they depend on both the object that is being measured and the viewpoint from which they are taken; e.g., readability of a text, level of user satisfaction.


Here is a depiction that better represents the hierarchical structure:



Now let us look at an example. I have tried to use something that relates to project-level work, such as change request turnaround time. Working from a current state to a future state, one of the things we identified in a SWOT is the need to be reactive. As the business changes, so should the processes and improvements the EA efforts are overseeing.  Thus, we want to track improvements from the viewpoint of the project manager.




This provides a very measurable approach to the metric, one that everyone can agree to, measure, and report on over time.
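To make that hierarchy a little more concrete, here is a small Python sketch of the change-request turnaround example (my own illustration; the question wording and metric names are assumptions):

```python
# A minimal GQM tree: one goal, characterizing questions, quantitative metrics.
gqm = {
    "goal": {
        "purpose": "Improve",
        "object": "change-request process",
        "issue": "turnaround time",
        "viewpoint": "project manager",
    },
    "questions": [
        {
            "text": "What is the current turnaround time per change request?",
            "metrics": [
                {"name": "average days open", "kind": "objective"},
                {"name": "staff hours per change request", "kind": "objective"},
            ],
        },
        {
            "text": "Is turnaround perceived as improving?",
            "metrics": [
                {"name": "PM satisfaction rating", "kind": "subjective"},
            ],
        },
    ],
}

# Every metric must trace back to the stated goal -- no orphan measurements.
for q in gqm["questions"]:
    for m in q["metrics"]:
        print(f'{gqm["goal"]["issue"]} <- "{q["text"]}" <- {m["name"]} ({m["kind"]})')
```

The point of the structure is the traceability: if a proposed metric cannot be attached to a question, and that question to the goal, it does not get collected.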

Let me know what you think or if I am off base here. I would like to hear how others have built a consensus in their efforts around performance and tracking EA efforts.

Thanks again for reading!

Sunday, April 10, 2016

EA 873 Week 12 The Effectiveness Challenge

As always, hello! Welcome back, if you are returning! Thanks for reading, if this is your first time!

This week our readings were focused on governance. Governing, according to Gartner, "refers to the processes and organizational structure, along with their associated input and decision rights, that guide desirable enterprise behavior."  The reading went on to discuss touchpoints for EA and governance processes.  Artifacts, and the management of artifacts, become a critical part of the EA governance process.  As I have been doing the last couple of weeks, I wanted to look back into my personal experience for context and insight related to the EA topic we are discussing this week.

Artifacts and governance remind me of a concept called Reuse-driven Software Engineering.  I have used this process as part of Center of Excellence (COE) efforts around Business Intelligence and Analytics.

Reuse can be thought of as a set of actions that maximizes the use of assets, processes, and artifacts from one line of business (LOB) for the development and deployment of solutions in another LOB.  This approach seems simple conceptually, but it can be very complex and challenging to implement.  This implementation is where COE and EA governance efforts come into play.  Organizational models, new architectures, and processes will need to be introduced and managed/governed if reuse is to be effective.  When implemented effectively, a continuous-improvement-oriented solution delivery model can be deployed, as depicted below.




This delivery model can be characterized by being:

  • Customer-driven: Collaboratively developed with the business and aligned with IT.
  • High-value: Provides impact to the business that is aligned with goals and objectives.


In a COE scenario, this approach would utilize the prioritized business models as the primary means of governing and aligning capabilities. These capabilities, in turn, drive the solutions required by the business units.  In my opinion, this is how the EA team should approach governance of artifacts in the organization.


Here is a link related to this subject; while dated (2009), it suggests that there may be a connection between EA and Reuse.

http://www.architectingforum.org/whitepapers/SAF_WhitePaper_2009_9.pdf

Do you also see the COE management model as viable for EA artifact governance? As always I welcome your thoughts and comments!


Thanks again for reading. Again, I hope this starts to tie together my past posts into a more useful framework




Sunday, April 3, 2016

EA 873 Week 11 The Foundation for Execution

Hello and welcome back, if you are returning! Thanks for reading, if this is your first time!

A lot of this semester's focus has been on the different aspects of an EA program or initiative. In this blog, I am going to pull together some elements and discuss assessing an organization's foundation for execution. A lot of these thoughts have been the result of reading the book assigned to us for EA 872,

"Enterprise Architecture As Strategy" by Ross, Weill, and Robertson. 

The book discusses many warning signs of a company that doesn't have a foundation to support its strategy.  A foundation is critical to developing an increased effectiveness of Marketing, Sales & Service Operations from the perspective of the enterprise. It should provide for increased standardization within and across all operational centers from the standpoint of the leadership team. It must also improve the integration of strategic objectives from the Stakeholders’ perspective.

I've attempted to show this by utilizing one of the techniques discussed in my week five post, SWOT with storyboards:

http://joec3s.blogspot.com/2016/02/ea-872-week-5-getting-on-same-page.html

The SWOT analysis will show that Enterprise commitment to EA is critical for successful deployment and acceptance, thus creating a framework for execution.  We can then follow the issues that arise when looking to establish enterprise level programs like EA.  To recap, colors used below map as follows:

  • Strengths = GREEN
  • Weaknesses = YELLOW
  • Opportunities = BLUE
  • Threats = RED


EA Programs start with enterprise commitment

STRENGTH: It MUST start from the top, with executive ownership and leadership.



EA Programs Promise Business Value Realization

This leads to OPPORTUNITIES that can remove silos (Organization, Data, and Processes), which allows one to stay a step ahead and improve Competitive Positioning and Customer Service.


Obstacles to Success Exist within Enterprises

Major WEAKNESSES, such as too much data/information or the incorrect type of information needed to respond quickly and efficiently in a rapidly changing business environment, typically sabotage opportunities.


The Price of Failure is High

The price of failure is apparent. These THREATS lead to loss of revenue, decreased profitability, and eventual loss of customers.


You can overcome

Determining ‘the Right’ foundation for your EA program can mitigate these risks and help it succeed.



However, to do so, the foundation must take into account PROCESS, ORGANIZATION and ARCHITECTURE.


Thanks again for reading. Again, I hope this starts to tie together my past posts into a more useful framework.




Sunday, March 27, 2016

EA 872 Week 10 How to measure the gap

Hello! Thanks for reading! If you have been here before, welcome back!
This week we had readings dealing with Gap Analysis and Road Maps, among other items.  I started discussing some aspects of Road Maps in my week 8 posting here:

http://joec3s.blogspot.com/2016/03/ea-872-week-8-route-planning.html

I also wrote some details about the project complexity assessment technique in week 6 here:

http://joec3s.blogspot.com/2016/02/ea-872-week-6-low-hanging-fruit.html

For this week, I am going to continue with an item I touched on at the end of that post: project complexity and gap analysis. Project complexity analysis is an assessment technique that can be used during a gap analysis effort. When moving towards a future state architecture, it is important to understand the impact of any given project or process. Complexity assessment is the process of determining the scope and initial timeline of any project or sub-project associated with the future state effort. Each effort is assessed based on its Requirements Complexity and Environment Complexity.  Each complexity dimension uses multiple evaluation criteria to accomplish a sensitivity-based assessment.

Requirements Complexity is a measure of how complicated it is for a delivery team to provide the solution to the business users.

Environment Complexity is the measure of how well the current state environment supports the expected needs of the solution.

The goal is to map an effort, based on complexity, into a project type (1, 2, or 3). You can get more detail on the "Types" in the week six link noted above. To pull this all together and better understand this process, I have attempted to break out the complexity elements as I might approach them during an assessment effort. Assume that this is a Business Intelligence reporting or data analytics effort.


Requirements Complexity
Requirements complexity, as stated, is a measure of how complicated it is for a solution delivery team to provide the reporting solution to the business users.  The requirements complexity criteria are listed below.

Graphical User Interface
Customization of ‘out of the box’ functionality:

  • High – more than 20%
  • Medium – under 20%, but greater than 10%
  • Low – 10% or less

Analytics
How much analytic functionality is required in this solution?

  • High – requires advanced analytics, newly defined business process.
  • Medium – calls for advanced analytics, existing business process.
  • Low – take it or leave it (standard, out-of-the-box analytics).

Security 
What is the expected security level required for this solution?

  • High – new security model required. 
  • Medium – role and data level security; could have regulatory issues.
  • Low – role-based, no expected issues.

Reuse Effort 
What is the expected reduction in effort due to the use of reusable assets?

  • High – less than 50%
  • Medium – over 50%, but less than 80%
  • Low – 80% or more

Expected Deployment 
How many users are expected to be using the deployed solution?

  • High – over 500 
  • Medium – less than 500, but more than 50
  • Low – 50 or less



Environment Complexity
Environment Complexity, as stated, is the measure of how well the current deployment environment supports the expected needs of the solution. The environment complexity criteria are listed below.

Architecture Development
What is the expected need for new architecture components for deploying this solution?

  • High – new architecture required (source data – staging – repository – reporting) 
  • Medium – new repositories and/or data mart required
  • Low – existing deployment environment sufficient

Software & Hardware
What is the match with the currently deployed software and hardware within the development and deployment environments?

  • High – None, new software and/or hardware platforms required
  • Medium – some new hardware or software required, but not currently used in other deployed solutions
  • Low – current environment meets needs

Data Sources
How many new data sources are needed?

  • High – three or more new data sources required
  • Medium – 1 to 2 new data sources required, but sources not qualified and/or new transformations required against current data sources
  • Low – no more than two new data sources, all from qualified sources

Reports 
What is the adequacy of the current reporting capabilities?

  • High – new business reporting required
  • Medium – expanded reporting required, needs business validation
  • Low – current reports, interactive features only

Business Rules Development 
Are there existing business rules that support the reporting requirements of this solution?

  • High – new processes being developed, business rules dependent on completion
  • Medium – new business rules required for current or new reporting
  • Low – existing business rules used in current reporting

As I noted above, getting to that project type is the ultimate goal of this effort. This classification can then be associated with timelines and costs, thus adding tremendous value to the gap analysis effort.
This information, in turn, helps ensure effective expectation management from the very early stages of project inception.
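As a rough illustration of how these ratings could roll up into a project type, here is a Python sketch (the numeric scoring weights and type cut-offs are my own assumptions, not part of the assessment technique itself):

```python
# Roll up High/Medium/Low criterion ratings into a complexity score per
# dimension, then map the two dimensions to a project type (1, 2, or 3).
SCORES = {"Low": 1, "Medium": 2, "High": 3}

def dimension_score(ratings):
    """Average the criterion ratings for one complexity dimension."""
    return sum(SCORES[r] for r in ratings) / len(ratings)

def project_type(req_ratings, env_ratings):
    """Assumed mapping: both dimensions low -> Type 1; either high -> Type 3."""
    req = dimension_score(req_ratings)
    env = dimension_score(env_ratings)
    if req < 1.75 and env < 1.75:
        return 1
    if req >= 2.25 or env >= 2.25:
        return 3
    return 2

# BI reporting example: GUI, analytics, security, reuse, deployment...
req = ["Low", "Medium", "Low", "Low", "Medium"]
# ...architecture, software/hardware, data sources, reports, business rules
env = ["Low", "Low", "Medium", "Low", "Low"]
print(project_type(req, env))  # -> 1 (a "low hanging fruit" candidate)
```

The exact weights matter less than the fact that everyone rates against the same published criteria, so the resulting type is defensible when it drives timeline and cost estimates.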

If you want to see something similar in action, look at this link from the Canadian government:

http://www.tbs-sct.gc.ca/hgw-cgf/oversight-surveillance/itpm-itgp/pm-gp/doc/pcra-ecrp-eng.asp

This approach is more quantitative in the way it scores and in the added depth it takes.

Thanks again for reading, and as I stated last week, I hope this starts to tie these past posts together into a more useful framework.


Sunday, March 20, 2016

EA 872 Week 9 Starting to pull it all together

Hello again! Welcome back from our short break and thanks for reading.
This week we have been discussing "Current State" documentation.  That's a fine segue from all the future vision discussions we have been having these past couple of weeks.

There is no simple or direct method of establishing the current vision. In my opinion, this is where most of the trip-ups occur. My reason for saying this is that it is not a strictly technical exercise. One cannot just fire up a modeling tool of choice and run out to document the organization.  We can have great modelers and an understanding of architecture standards and supporting technology, but that only gets us so far. An EA practitioner must also understand the dynamics of the business and its culture, so that outside influences can be determined and accounted for during the documentation effort.  If you are wondering why you should care as an architect...how else can you understand where to start? Should you begin with that one-off database maintained by some LOB? What about the CRM system or the online ordering system? How do you get focused and know where to start?

Understanding the business is the first step. Do you understand the enterprise's mission and objectives? Do these match the goals of the CEO and other C-level executives? From there, you can identify the critical solutions or areas of the business where you should direct your focus first. Do you have a resource on your team with a thorough understanding of the company? If not, no amount of technology will keep you from getting bogged down and lost, with little value delivered.  Thus, to be effective at working through the current state and producing some of the documentation we have discussed, one needs a supporting and repeatable framework to assist in the process.  The first part is to establish a set of questions that can pull relevant information from key stakeholders and that can also be quantified.

One industry model is the MIKE2.0 framework, which can be found here:

http://mike2.openmethodology.org/wiki/Information_Maturity_QuickScan


This provides a nice format to collect data and produce results that are quantifiable, with some nice Gartner-type four-quadrant graphs. My concern is that this might produce too much data and come across as overly complex in the final analysis, depending on the audience.
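As a toy illustration of how quantified survey results could land on a four-quadrant graph, here is a Python sketch (my own simplification, not MIKE2.0's actual scoring model; the axis names and labels are assumptions):

```python
# Average 1-5 survey answers on two assessment axes, then place the
# organization in one of four quadrants relative to a midpoint.
def quadrant(people_process_scores, technology_scores, midpoint=3.0):
    """Return a quadrant label for averaged survey responses."""
    x = sum(people_process_scores) / len(people_process_scores)
    y = sum(technology_scores) / len(technology_scores)
    if x >= midpoint and y >= midpoint:
        return "leader"
    if x >= midpoint:
        return "process-strong"
    if y >= midpoint:
        return "technology-strong"
    return "laggard"

print(quadrant([4, 4, 3], [2, 2, 3]))  # -> process-strong
```

Something this simple keeps the final analysis digestible for the audience, which is exactly the concern with a heavier framework.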

If you look back on my blogs, you will see some references to defining key metrics and using responses to drive information acquisition.

One example can be found in Week 7 - Delivery Approaches

http://joec3s.blogspot.com/2016/02/ea-872-week-7-delivery-approaches.html

Another example is found in Week 5 - Getting on the Same Page

http://joec3s.blogspot.com/2016/02/ea-872-week-5-getting-on-same-page.html

I discuss two techniques that I believe are excellent at getting this relevant information in a quantitative way: SWOT with a storyboard, and GQM (Goal, Question, Metric).  This type of visual has proven to be effective with C-level executives, yet it provides information and details that can be quantified.  I will not repost those details in this week's blog, but take a look back if you have a chance, and let me know what you think.

Hopefully, this post starts to tie together these items that I have been writing about these past weeks into a more useful framework.

It would be great to hear about what has helped you with useful current state analysis techniques.

Thanks for stopping by and as always I appreciate your thoughts and comments!









Sunday, March 6, 2016

EA 872 Week 8 Route planning

Over the last couple of weeks, I have been writing about different aspects of past work I've done and how that relates, in my opinion, to the academic and formal definition of EA and its principles. Based on some of the comments I have received, it seems that I have pointed to some items that others have found interesting. I know I have found the comments and discussion informative as well. At this stage, it seems to make sense to keep on that path. Hey, if something is working for you, why change? Isn't that how the saying goes? How many folks have heard that during a project!? Nonetheless, I'm going to keep on the same path this week and discuss roadmaps.

Roadmaps are an essential part of the Future State Architecture. A roadmap is a visual representation of the future vision and an important artifact for executives and decision makers. For the team, it is a major component of the planning process.  How do we get to this map...the artifact that is intended to help us plan the route from the current to the future state?  CIO magazine reports that the EA roadmap is the linchpin for transformation.

"Achieving a transformed ‘future state' requires a tool to guide and govern day-to-day resource investment decisions - the Enterprise Roadmap."

http://www.cio.com/article/2372268/enterprise-architecture/the-essential-ea-toolkit-part-4---an-enterprise-roadmap.html

So, how do we build this and what does it look like when completed?  In the past, as part of larger strategy and planning efforts, I have attempted to develop an approach that assists in this roadmap planning process.

This process starts with identifying potentially harvestable assets and using those to create a "candidate asset list." Then we would go through the following planning process:
  • List upcoming and existing projects.
  • Eliminate projects that do not have high affinity.
  • Identify key services required by these projects.
  • Analyze the candidate asset list.
  • Perform market analysis and technical/cost-benefit analysis on the candidate asset list.
  • Identify potential assets to be provisioned in the next quarter.
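The planning steps above could be sketched as a simple filtering pipeline in Python (the data shapes, names, and affinity threshold here are my own assumptions):

```python
# The planning process as a pipeline: filter projects by affinity,
# collect the services they need, keep candidate assets that serve a need.
def plan_quarter(projects, candidate_assets, min_affinity=0.5):
    # 1-2. List projects, eliminate those without high affinity
    kept = [p for p in projects if p["affinity"] >= min_affinity]
    # 3. Identify key services required by the remaining projects
    needed = {s for p in kept for s in p["services"]}
    # 4-6. Analyze the candidate list; provision assets that serve a need
    return [a for a in candidate_assets if a["provides"] in needed]

projects = [
    {"name": "CRM rollout", "affinity": 0.8, "services": ["identity", "reporting"]},
    {"name": "legacy tweak", "affinity": 0.2, "services": ["batch"]},
]
assets = [{"name": "SSO module", "provides": "identity"},
          {"name": "batch engine", "provides": "batch"}]
print(plan_quarter(projects, assets))  # -> [{'name': 'SSO module', 'provides': 'identity'}]
```

Notice how the low-affinity project drops out early, and with it the asset that only it would have used; that is the whole point of running the steps in this order.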


The context of the planning effort can be visualized as depicted below.



The hardest part of this process was getting a group to perform an appropriate asset analysis and determine how an asset ties back to the business and its place in any given market. The flow of this analysis was to determine the appeal to the company, determine the competitive position, establish the market opportunity, and then decide if a cost-benefit analysis was required.



This resulted in an easy way to visually measure and demonstrate the value of any given asset in the planning process. By creating a graph of attractiveness to the market by competitive position, one can show what assets should be pursued now, in the future, or not considered.
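That pursue-now / pursue-later / drop decision could be sketched like this (the normalized scores and the cutoff are my own assumptions):

```python
# Classify an asset by where it lands on the attractiveness-by-
# competitive-position graph (both scores normalized to 0..1).
def asset_decision(attractiveness, competitive_position, cutoff=0.5):
    if attractiveness >= cutoff and competitive_position >= cutoff:
        return "pursue now"
    if attractiveness >= cutoff:
        return "pursue in the future"  # attractive, but position must improve
    return "do not consider"

print(asset_decision(0.8, 0.7))  # -> pursue now
print(asset_decision(0.8, 0.3))  # -> pursue in the future
print(asset_decision(0.2, 0.9))  # -> do not consider
```

The graph version of this is what goes in front of the team; the rule behind it just needs to be agreed on once.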



Now the team can quickly review the asset analysis against projects and provide an easy visualization. This can then drive discussions around ROI, necessity, reusability, and strategic alignment.  These are all items necessary for planning and gap analysis, an important aspect of the road mapping effort.


Thanks again for stopping by and reading! I welcome and look forward to thoughts and comments.

Sunday, February 28, 2016

EA 872 Week 7 Delivery approaches

Hello again! As always, thank you for reading!

My last couple of blog posts have been focused on different tools and techniques that I have used to establish support, agreement on vision, and key metrics. All of these approaches have been targeted at moving an organization or project to a future vision or state.  This week, readings on "Logical Levels" talked about creating or defining process patterns, viewing information points, and services architectures.  So, as I have done the last couple of weeks, I'm going to reach into my past and discuss another approach that I have successfully used.  We derived a lot of the data that I have written about during a week-long Strategy & Planning session. When all of this is pulled together, the other artifacts get us to some of the logical views discussed this week. This information happens to be focused on Business Intelligence solutions, so it will be biased toward reporting and data analysis, but I feel these approaches and techniques can be useful in any EA effort.

We would often utilize a workshop-driven approach to accelerate our client’s knowledge acquisition and planning of the process, organizational change, and capabilities necessary for planning and delivery of key initiatives. This could easily be the Future State we often discuss in EA.  Here is a diagram that better depicts this approach.




If you look back at the last couple of weeks' blogs, you will see that there is a series of artifacts created, including SWOT, GQM, and Project Complexity. The workshop takes this information and works to tie it together into more logical views focused on three key areas necessary for development and ongoing execution. These are:


  • Reuse Library: Contains all reusable assets, certified for use in the Application Architecture.
  • Application Architecture Repository:  Software structure for each application of business intelligence & reporting solution, verified for delivery into the Deployment Architecture.
  • Deployment Architecture: Technical structure for all business intelligence & reporting solutions to be designed, developed, and released into a production environment within an enterprise.


We refer to this collection as a "Service Factory Architecture" which can be depicted as such:



I could go on a deeper dive into each of these three areas, but the primary goal of this approach is to drive increasing value for the business through a delivery model capable of:

  • Repeatability – increasing consistency, quality, and reliability for client value
  • Maintainability – minimizing (and ultimately eliminating) critical resource dependencies and constraints
  • Sustainability – managing resource allocation (internal and external) to optimize delivery
  • Reusability – leveraging common approaches, software, and tools across multiple solution delivery projects

This delivery framework is based on industry principles present in change management and continuous improvement methodologies, and thus its structures relate back to the ‘plan – act – measure’ paradigm.  This begins to tie into how we manage from one state to the next.  To understand more completely, here is a view of the other two fundamental aspects of this framework: organization and process.



As with the Service Factory Architecture, I could go into each of these areas in more detail and may do so at a later date. As we go deeper into each area, what results are artifacts very similar to what was discussed in this week's readings: logical views of the organization and its processes.

I continue to read a lot about frameworks in class, but I sometimes feel that how to deliver within these frameworks is relatively light on detail.  A quick Google search for service delivery frameworks or IT project delivery nets very few results. It would be great to hear other approaches to this.

Thanks again! I look forward to your comments.

Saturday, February 20, 2016

EA 872 Week 6 Low Hanging Fruit

Welcome back and thanks for continuing to read my postings!

These last few weeks we have discussed a lot of information on current views and future state architecture.  One of the focuses has been on the ability to justify decisions based on how they tie back to business strategy.  Last week I talked about some techniques I have used in the past that provided a set of artifacts that could be utilized across project efforts. These SWOT storyboards and the GQM matrix also assist with continued buy-in and help teams tie back to business strategy.  One thing I have found missing from these discussions and readings is "where to start?" We often hear the term "low hanging fruit," but what is it and how do you find this fruit? This is a question I often had when working with different customers, so my team and I came up with an approach that we found quite useful; it delivered a metric we could use to assess the complexity of a proposed effort.

This project complexity effort was the key to project alignment and solution delivery. It seems to me that this could also be useful in determining both where to start and the extent of the gaps in getting to a particular future vision. Project Complexity Assessment is the process of determining a specific project's likely complexity. It rates a project's complexity by assessing the project's anticipated requirements complexity against its anticipated environment complexity. It then identifies the most likely classification by establishing a complexity type for early estimation and planning purposes. This helps ensure effective expectation management from the very early stages of project inception. It also allows one to find those "low hanging fruits." This image represents the different levels of complexity.


Once you have determined the project's complexity type, it is easy to present this in a graph. This demonstrates which complexity type the project best fits and also maps the project's affinity for both requirements and environment complexity.
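To make the mechanics concrete, the scoring behind such an assessment can be sketched in a few lines. This is a minimal sketch under my own assumptions: the 1-5 rating scale, the factor lists, and the type labels ("Simple", "Complicated", "Complex") are illustrative, not the published ICCPM tool or any particular methodology.

```python
# Illustrative Project Complexity Assessment: rate factors on each axis,
# average them, and map the two axis scores onto a complexity type.

def average(scores):
    """Average a set of 1-5 factor ratings."""
    return sum(scores.values()) / len(scores)

def classify(req_score, env_score, threshold=3.0):
    """Map requirements/environment scores to an illustrative type."""
    if req_score >= threshold and env_score >= threshold:
        return "Complex"
    if req_score >= threshold or env_score >= threshold:
        return "Complicated"
    return "Simple"

# Hypothetical ratings gathered in an assessment workshop (1 = low, 5 = high).
requirements = {"clarity": 2, "volatility": 4, "interdependence": 3}
environment = {"stakeholders": 4, "technology": 2, "governance": 3}

req_score = average(requirements)
env_score = average(environment)
print(req_score, env_score, classify(req_score, env_score))  # → 3.0 3.0 Complex
```

The two axis scores are exactly what gets plotted on the graph above: one point per candidate project, with the "Simple" corner being your low hanging fruit.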



There are certainly many articles and books out there about project complexity. This one on Amazon
http://www.amazon.com/Project-Complexity-Assessment-ebook/dp/B00C2HXX58/
is a result of research from the International Centre for Complex Project Management, located here
https://iccpm.com/content/complexity-assessment-tool.  To me it appears that a lot of this information is focused on the project itself rather than on how it ties into the overall Business Architecture or EA.

Once again these are just some techniques I have used effectively in the past. I believe they can add value in the EA process by potentially identifying the least complex place to start. In turn, this can offer the greatest chance of success in the shortest amount of time and as a result, add credibility to the EA team.


Please let me know your thoughts.

Thanks for reading!

Sunday, February 14, 2016

EA 872 Week 5 Getting on the same page

Hello again and welcome to my weekly blog. If you are returning, thanks for coming back! If this is your first time, thanks for checking it out!

This week's readings have been about understanding the business context. This topic got me thinking about many of my past experiences and how often this lack of understanding has torpedoed a project or initiative. As I stated in my week three post, I have often witnessed the need to build a strong team and adopt a shared environment of people, processes, and tools for project success with most of my customers. This building and adoption process has to start with a common understanding.

I was reading the Gartner document (G00142111) "Building a 'Fast-Path' Common Requirements Vision," and it again sent me back to past experiences. While I do not disagree with the content of the document, I wonder whether there is a pre-process to this CRV effort that could accelerate this type of deliverable and allow the team to reach a common understanding quickly in any conversation.

I have found that two, perhaps dated, techniques, when combined and tweaked a little, can provide an overall understanding that can drive the CRV. These two techniques are a SWOT analysis followed by a Goal, Question, Metric (GQM) exercise. I know SWOT is not pretty, but I have adapted it into an approach that works well. I use colored sticky cards to solicit anonymous responses for each SWOT lane:

  • Strengths = GREEN
  • Weaknesses = YELLOW
  • Opportunities = BLUE
  • Threats = RED


From there I arrange the cards on the wall during group discussion for consideration and feedback. From these groupings, I establish a series of Critical Success Factors (CSFs) that evolve into a storyboard. Here is an example:

The next day, these CSF storyboards would be laid out on the wall of the meeting room. This visual provides an easy-to-read view of the CSFs and the stimulus for each. A digital version is created for storage and distribution, so the artifact is easily recalled when one needs to come back to it.
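The card-sorting step above can be sketched as a simple grouping exercise. The card texts here are hypothetical examples of my own; in practice the value comes from the anonymous collection and the group discussion, not the sorting itself:

```python
# Group anonymous sticky cards into SWOT lanes by color.
from collections import defaultdict

LANE_COLORS = {
    "GREEN": "Strengths",
    "YELLOW": "Weaknesses",
    "BLUE": "Opportunities",
    "RED": "Threats",
}

# Anonymous responses collected as (color, text) pairs.
cards = [
    ("GREEN", "Strong DBA team"),
    ("YELLOW", "No common metric definitions"),
    ("RED", "Competitor reporting is faster"),
    ("YELLOW", "Manual report distribution"),
]

lanes = defaultdict(list)
for color, text in cards:
    lanes[LANE_COLORS[color]].append(text)

for lane, items in lanes.items():
    print(f"{lane}: {items}")
```

Each lane's cards then become the raw material for the CSF storyboards.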

Now that we have this level of consensus, we would begin GQM sessions.


Per Basili et al. in The Goal Question Metric Approach, GQM is:
“A technique that is based on the assumption that for an organization to measure in a purposeful way it must first specify the goals for itself and its projects, then it must trace those goals to the data that are intended to define those goals operationally, and finally provide a framework for interpreting the data with respect to the stated goals.”  

The GQM process allows one to link conceptual goals back to the SWOT, and then to link operational-level questions to quantitative data or metrics. These are then used to measure success and failure.
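The structure this produces can be represented as a small nested tree of goal → questions → metrics. The goal, questions, and metrics below are hypothetical examples of my own, not from an actual engagement:

```python
# An illustrative GQM tree: one conceptual goal, operational questions,
# and the quantitative metrics that answer each question.
gqm = {
    "goal": "Improve timeliness of executive reporting",
    "questions": [
        {
            "question": "How current is the data when a report is viewed?",
            "metrics": ["data refresh latency (hours)", "report age at view time"],
        },
        {
            "question": "How often do reports miss their delivery window?",
            "metrics": ["missed-deadline count per month"],
        },
    ],
}

# Walking the tree yields the flat list of metrics to collect,
# each traceable back to the goal it supports.
metrics = [m for q in gqm["questions"] for m in q["metrics"]]
print(metrics)
```

That traceability, every metric answering a question that serves an agreed goal, is what makes the resulting measurements defensible to an executive team.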

I could probably write a blog on just this technique and the success we have had using it, but I want to circle back to my original point. Doing an exercise like this sets a series of high-level strategic goals and objectives that a management or executive team can buy in on. This further helps EA with future vision planning and migration efforts because it also provides an agreed-upon set of metrics to track goal achievement. I would propose that one would benefit from having these defined objectives and metrics to work with while building out the CRV.

Let me know what you think or if I am off base here. This is just an approach I have used with success, and I would like to hear how others have built consensus in their efforts.

Thursday, February 4, 2016

EA 872 Week 4 It's all about the people!

Hello again!

This week has been a wild one! I am jumping into class team assignments, which are always a little tricky at the start. One has to get organized and establish some level of communication with the team, while never meeting any of them face to face!  For me, this tends to be easy as I telecommute and work with customers/sites remotely a majority of the time.  Given this, I started thinking about how it might affect the other team members.  Are they used to working with remote teams in their working environments or are they more comfortable face to face? My guess is that it is probably a mix and also dependent on the culture of their organization. As I was thinking through this, I thought I might as well write about culture and the impact on EA this week.

To start with, how does one define culture? Or more precisely, how is organizational culture defined? The Business Dictionary states that it is "The values and behaviors that contribute to the unique social and psychological environment of an organization."
Read more: http://www.businessdictionary.com/definition/organizational-culture.html

I would say that organizational culture is shared thoughts and actions. It is these values and beliefs that tend to govern how people act in an organization. Given that culture is a reflection of the business, there can be subcultures within a working business unit or team, or even within a LOB. Take a minute and think about the different groups within your organization. Do you all have the same goals and values in the workplace? Maybe, but probably not.

Since every organization and team is different, one of the first things we, as EA practitioners, need to identify is the culture and the best way to interact within the cultural boundaries of that group or team. I am sure you have heard the saying "You catch more flies with honey than vinegar." Well, the same theory applies to starting your EA initiative. If you are considered an outsider, getting anything accomplished is going to be a challenge. The best way to come across as understanding, as "one of us," is first to understand the culture of the group you are going to be engaging. If you do not do this first, then you risk all of the team's EA efforts being for naught. Don't get me wrong, I think there is tremendous value in all the artifacts that are produced, but remember, EA is about transformation. If the organization, LOB, or team is not "culturally" ready to accept the transformation effort, then the EA effort is headed toward failure.

This may seem like a heavy-handed statement to some of you. However, the Gartner article "Psychology May Hold Key to Successful Enterprise Architecture" from December 2005 by Robert A. Handler illustrates this point. Mr. Handler explicitly states that "Failure to address the human aspects of EA leads to EA failure." The problem is not getting the data, process, or state defined correctly. It is not about anything technical. It is about understanding people; failing to do that can become the single biggest point of failure. Addressing the people issues means understanding the culture. When you do that, you will better understand how to be viewed as "one of the team," and that, in my opinion, will direct you toward success in your EA initiative.

Have you ever experienced an issue with culture and your EA efforts? If so how did it affect your initiative or project?

Thanks for stopping by and taking the time to read!

Sunday, January 31, 2016

EA 872 Week 3 Where to Start

Assuming one has acceptance of an EA program, where do you go from there? You hear a lot on this topic. Some of the readings and recordings from this week reflect the "current state vs. future state" debate. The presentation from Gartner Research VP R. Scott Bittler, "Enterprise Architecture Program Pitfalls: Current State First?", was focused on this topic and very thought-provoking.

"Where do we start?" is kind of like the "chicken or the egg" question. Which came first? Where you start, in my view, does not matter if you do not have a process to manage that change. If our goal, via EA, is to deliver value early and often, we need to provide not only decision-making guidance but also a deployment process to support that business decision.

Once again I am going to pull back the covers of my past for some practical experience. In my opinion, some of the biggest disasters I have been a part of were due to the inability of an organization to implement a new process. As a consultant, I have found that once a process is defined and agreed upon, it is in the implementation that the battle is won. A project can rapidly go off track if the organization does not have a process in place to facilitate deployment quickly and efficiently. We cannot deliver value "often" if the deployment process does not work right. It is like designing an emissions-free vehicle that everyone wants to buy but that you cannot get out the door to the market. No matter how great everyone thinks something is, people lose trust when one is unable to deliver.

In my business, we found we needed to build a strong team, establish guiding principles, and adopt a shared environment (people, process, and tools) when deploying new solutions for our clients. This became more critical than what they were currently doing or how they wanted to evolve their processes and supporting solutions. Once a delivery framework is established, these principles and methods can be applied to individual business unit initiatives and projects across the enterprise. The role of EA is to facilitate the movement from current to future state. That change is managed by adopting a common delivery framework and supporting toolkit. This, in turn, tends to reduce complexity and provides a means to deal with the diversity inherent in most business units. This framework provides the ability to respond to changing business needs rapidly and efficiently.

So back to the debate...what comes first? Mr. Bittler goes on to talk about doing the current and future state in parallel and the pitfalls of one approach vs. the other.  To manage any of this, one would need to provide change management in a quick and dynamic way that allows for all types of changes. Incremental and radical changes will need to be managed at different times and across business units.

I believe the real starting point is having your delivery framework and management process established. Once you have that, it is "game on" in my opinion!

Thanks again for reading! What are your thoughts and experience?

Thursday, January 21, 2016

EA 872 Week 2 The Effectiveness Challenge

Hello again and welcome to my weekly thoughts on Enterprise Architecture!

There have been several interesting readings this week, not just from the class assignments but also from classmates. I especially appreciate those that found what I wrote to be interesting enough to comment.  Thanks!!

This week I ran through the EA Roadmap Process and also joined the class discussion on the review of the Gartner toolkit for driving strategic EA.  There were quite a few interesting thoughts and comments around this toolkit and how to present to executives, such as how much is too much on a slide and other details around presentation formats.  I joined in that conversation with my thoughts on what or how to present back to an executive team, but I started thinking about effectiveness and the challenge that we all face around initiatives and executive buy-in.  As I was thinking through this, I came back to some of my experience as a C-level executive working through both internal and customer challenges.

What I am about to say might not be taken well, especially with the group that I am a part of, but hear me out.  A lot of the time the challenge when it comes to being effective in a project is IT.  We have all been there - we get pushed by the business, they do not understand what it takes, we do not have the people or budget. If you say "no, not me" take a quick look in the mirror and look yourself in the eye! So when we talk about EA and foundations we should take a look at the challenges with IT and how that can affect what we can do in a business-driven EA approach.

Executives are concerned about their Return on Investment (ROI) from IT. That is no secret. IT expenditures continue to grow, which directly affects the bottom line. Therefore, many executives begin to scrutinize the effectiveness of their IT organizations, and this scrutiny can lead to a bias the minute any new initiative is proposed. As a result, IT needs to gain credibility within its domains. It needs to show that it can be adaptive and fulfill the strategic requirements of the business in a timely and efficient manner. If not, then a roadblock to creating an EA foundation is going to exist. If a lack of credibility persists, you may find executives agreeing with the need for EA, but experience shows they will take that initiative to large systems integrators and product and services companies for execution. Remember what I said about IT pushback? It is bound to happen when the business hires a firm to tell IT what to do, right?

This idea is not mine, or even new. A quick search will show articles from 10 plus years ago talking about this same subject.  For instance, read this one from CIO.com in 2005...

http://www.cio.com/article/2448774/enterprise-architecture/the-relationship-between-enterprise-architecture-and-it-governance.html

I could go on about this, but my point is that to be effective in building an EA foundation, a better balance has to be found between the internal IT organization, internal business units, and external service and technology providers. A foundation can only be built on solid ground. In most organizations, that has to start with IT.

Friday, January 15, 2016

EA 872 Week 1 Getting Started

This will be my first go at a blog, professional or otherwise, so bear with me and please offer up any feedback you might have!

Throughout my career, I have always straddled the line between IT and business. My hope with enrolling in an EA program is to lend some formalization to aspects of my work that I do or have done.  With that said I hope you enjoy what I post and look forward to any comments you might have.

This week's focus has been on some of the core concepts necessary for both understanding EA and applying it within an organization. Readings this week focused on building foundations for excellence, along with activity cycles for categorizing the type of work that goes into building and maintaining these foundations. One of the other items discussed was context; in fact, context is one of the first and most important deliverables in the EA process.

As I read this information, I thought back on some of my consulting work in the mid-2000s. During that time, I was part of a small BI company. We worked in quite a few Fortune 50 companies, and one thing we consistently observed was that key metrics were often used, defined, and reported on differently across groups, teams, and executives. This often resulted in project and deliverable delays when delivering BI projects that spanned several lines of business. Try providing an executive-level dashboard or scorecard when those users cannot agree on a metric's use or definition. The deliverable or project is always going to be wrong to someone in some LOB! This is not going to achieve the value add to the business nor the project ROI, not to mention how successful it is perceived to be!

Going back through lessons learned, it became apparent that this was a common thread across customers and organizations. To execute better, we developed a series of workshops and processes to establish context around the project and across the organization. We offered this as a Solution Assessment service that focused on creating "context" for the solution and the associated efforts. We called this approach our Service Factory. It requires a shift in processes and organization to achieve a business-oriented, model-driven process, allowing the organization to recoup investments in its infrastructure and potentially create an ‘information on demand’ environment. Our approach was to provide the new procedures and practices to conduct:
  • Project initiation (selecting the right projects for using the infrastructure-enabled value);
  • Project enactment (executing the projects in an optimal manner);
  • Project support (efficient use of tools and solutions infrastructure provided as a service through the Center of Excellence).
All of these efforts provided a common understanding, or context, around the project initiatives and their future state or end goal. This gave each initiative common ground from which to build a foundation. Granted, this was and is a very project- or system-focused approach, but I see parallels and value here for EA.

This linkage between Systems Engineering and EA is discussed in some detail here, and I found it to be a great read, maybe because it supported my thought process!

https://ingenia.wordpress.com/2015/03/20/enterprise-architecture-and-systems-thinking-ian-glossop/

As I continue in my EA journey, I am beginning to see how some of my previous experience is relating to these fundamental concepts being laid out here at the start of the program.  That is both exciting and reassuring from my perspective and tells me that I am on the right path.

I hope you have enjoyed my ramblings for the week. Until next week, please let me know if you have any thoughts or comments!

Thanks for reading!