Wednesday, April 27, 2016

EA 873 Week 15 Final Thoughts

As always, hello! Welcome back, if you are returning! Thanks for reading, if this is your first time!

This posting is my final blog of the semester, and I have to say I have really enjoyed it! I want to thank everyone who has commented. I really value the thoughts, ideas, and feedback that I have received.

As a final topic, I want to discuss the execution of metrics and the tracking of performance. I had several comments around metrics, the value of performance tracking, and over-commitment. One classmate noted that one approach could use a "value-to-effort assessment for each metric. How will the collection of each metric increase workload?" You can find that discussion here.




He was quite right. If we don't understand how deploying these new metrics will impact data collection or the other efforts associated with IT and business operations, we could create a problem before getting started. I feel that some of this would get fleshed out during the project complexity assessment. You can find more detail on project complexity here:

Yet, I still wanted to dive deeper into this. One way to understand the effort behind a metric is to translate it into code and see where it falls apart.

For simplicity, let's agree that a metric can be defined as a conceptual, logical representation, reflected as such:




To better highlight this concept, take a look at this example:


This image highlights several items to consider. It takes both business and IT to produce any metric, much less EA performance metrics. A metric definition can be translated into code, at least pseudocode, very quickly in order to evaluate complexity.
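To make that concrete, here is a minimal sketch in Python of what "translating a metric definition into code" could look like. The class, field names, and effort formula are all illustrative assumptions on my part, not a standard; the point is that writing the definition down this way immediately exposes the value-to-effort question my classmate raised:

```python
from dataclasses import dataclass, field

@dataclass
class MetricDefinition:
    """A minimal, conceptual representation of a metric (all names illustrative)."""
    name: str
    owner: str                                              # business/IT group accountable
    data_sources: list = field(default_factory=list)        # systems the data must come from
    collection_steps: list = field(default_factory=list)    # steps needed to gather it

    def effort_score(self) -> int:
        """Rough value-to-effort input: more sources and steps mean more collection workload.
        The weighting here (steps count double) is an arbitrary assumption for illustration."""
        return len(self.data_sources) + 2 * len(self.collection_steps)

# Sketching even a simple metric quickly surfaces hidden effort:
cr_turnaround = MetricDefinition(
    name="Change request turnaround time",
    owner="PMO",
    data_sources=["ticketing system", "release calendar"],
    collection_steps=["export tickets", "match to releases", "compute elapsed days"],
)
print(cr_turnaround.effort_score())  # 2 sources + 2*3 steps -> 8
```

Even before any real implementation, the length of `collection_steps` alone tells you how much workload the metric will add.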


As always, I welcome your comments and thoughts. It has been great working with everyone this semester and I especially thank all who have taken the time to read and comment!


Saturday, April 23, 2016

EA 873 Week 14 The Final Countdown

As always, hello! Welcome back, if you are returning! Thanks for reading, if this is your first time!

Over the course of the semester, I have been discussing different approaches to what amounts to artifact creation. For the last two postings, I thought I would focus on the next step in the cycle and discuss solution delivery. In week 3 I briefly touched on some of these items, but it is worth highlighting them again for this discussion.


In my business, we found we needed to build a strong team, establish guiding principles, and adopt a shared environment (people, process, and tools) when deploying new solutions for our clients. This need proved more critical than what the clients were currently doing or how they wanted to evolve their processes and supporting solutions. Once a delivery framework is established, these principles and methods can be applied to individual business unit initiatives and projects across the enterprise. The role of EA is to facilitate the movement from current to future state. That change is managed by adopting a common delivery framework and supporting toolkit. This adoption, in turn, tends to reduce complexity and provides a means to deal with the diversity inherent in most business units. The framework should provide the ability to respond to changing business needs rapidly and efficiently.

In essence, the delivery process should facilitate the ability to:


  • Understand the objectives and key initiatives
  • Align those key initiatives with the objectives
  • Execute activities
  • Evaluate output for performance and ongoing improvement
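These four steps can be sketched as a simple pipeline. The function names and data shapes below are purely illustrative assumptions on my part, not part of any formal delivery framework; the sketch just shows how each step consumes the previous step's output and how evaluation feeds the improvement loop:

```python
def understand(initiatives):
    """Capture objectives and key initiatives as a working backlog."""
    return [{"initiative": i, "aligned": False} for i in initiatives]

def align(backlog, objectives):
    """Keep only initiatives that map to a stated business objective."""
    for item in backlog:
        item["aligned"] = item["initiative"] in objectives
    return [item for item in backlog if item["aligned"]]

def execute(aligned):
    """Run the activities; here we simply tag each item as delivered."""
    return [{**item, "delivered": True} for item in aligned]

def evaluate(delivered):
    """Feed output back for performance review and ongoing improvement."""
    return {"delivered": len(delivered), "improvement_notes": []}

backlog = understand(["CRM rollout", "Data cleanup", "Legacy retirement"])
aligned = align(backlog, objectives={"CRM rollout", "Data cleanup"})
result = evaluate(execute(aligned))
print(result["delivered"])  # 2
```

The "Legacy retirement" item dropping out at the align step is the whole point: work that cannot be traced to an objective never reaches execution.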


Given this, let's take the rollout of a CRM system as an example.
The following picture depicts this rollout as a swimlane diagram. As you can see, the swim lanes show participation from functional groups; the last two lanes are the continuous improvement/refinement steps.



Looking at the diagram, I hope it becomes clear why the artifacts discussed in my previous posts are so critical to the actual delivery process; look at the touchpoints across the organization. Even at this high level, one can begin to see the challenges if, for instance, we did not have our SWOT storyboard to refer the group back to during this process.

Week 11 Blog Post

What about the improvement/refinement steps? What if we lacked the GQM output I discussed last week?

Week 13 Blog Post


The Gartner article "Advancing the Common Requirements Vision Deliverable" (retrieved April 23, 2016, from
https://online.ist.psu.edu/sites/ea872fusco/files/advancing_the_common_requirements.pdf)
speaks directly to the need for an anchor model and to using business requirements to drive common solutions. It also talks about identifying improvements to core value-adding processes, which ties to the improvement/refinement step noted in the swimlane.

As I have mentioned the last couple of weeks, I hope this starts to tie my past posts together into a more useful, bigger picture.

As always your comments are welcomed! I look forward to your thoughts on how you have seen project and solution delivery occur in your career.

Monday, April 18, 2016

EA 873 Week 13 Tracking and Measuring Performance


As always, hello! Welcome back, if you are returning! Thanks for reading, if this is your first time! I am a little late getting my post done this week. Life got in the way, but I guess that is what this is all about...balancing the two!


I was going over some readings this week, and I was particularly intrigued by the Gartner presentation on Developing High-Impact EA Performance Metrics. It can be viewed here:

http://www.gartner.com/document/1413013

The presentation touched on why we lose our way when it comes to strategy execution. I have seen this in many forms, and if there is one theme to my posts this semester, it is that execution is crucial but only achievable with consensus and direction. The article goes on to argue that EA efforts should be guided by this strategic direction, becoming, essentially, the bridge between strategy and execution. To act as this bridge, we must measure EA performance through the use of metrics, giving EA and management the ability to track the overall performance of EA efforts.

I cannot agree more. Yet in my experience, this breaks down because most organizations have difficulty defining, building, and managing core metrics. I talk about the process a little in my week five post here:

Week 5 Blog Post


I touch on the metric discussion briefly near the end, including utilizing Goal, Question, Metric, or GQM.

To restate, per Basili et al. in The Goal Question Metric Approach, GQM is

“A technique that is based on the assumption that for an organization to measure in a purposeful way it must first specify the goals for itself and its projects, then it must trace those goals to the data that are intended to define those goals operationally, and finally provide a framework for interpreting the data with respect to the stated goals.”  

http://www.cs.umd.edu/~mvz/handouts/gqm.pdf

In other words, GQM is a hierarchical structure starting with a goal, which specifies the purpose of measurement, the object to be measured, the issue to be measured, and the viewpoint from which the measure is taken.

There are three levels to GQM.

Level 1, Conceptual level (GOAL): A goal is defined for an object, from various points of view, relative to a particular environment. Objects of measurement include:


  • Products: Artifacts, deliverables, and documents that are produced during the system life cycle; e.g. models, components, test suites.
  • Processes: Software related activities commonly associated with time; e.g. modeling, designing, testing.
  • Resources: Items used by the processes to produce their outputs; e.g. personnel, hardware, software, office space.


Level 2, Operational level (QUESTION): A set of questions used to characterize the way the assessment/achievement of a specific goal is going to be performed, based on some characterizing model (the RBM). These questions try to describe the object of measurement (product, process, resource) with respect to a selected quality issue and to determine its quality from a selected viewpoint.

Level 3, Quantitative level (METRIC): Data associated with every question to answer it in a quantitative way. The data can be:


  • Objective:  If they depend only on the object being measured and not on the viewpoint from which it is taken; e.g., number of versions of a document, staff hours spent on a task, size of a component.
  • Subjective:  If they depend on both the object that is being measured and the viewpoint from which they are taken; e.g., readability of a text, level of user satisfaction.


Here is a depiction that better represents the hierarchical structure:



Now let us look at an example. I have tried to use something that relates to project-level work: change request turnaround time. Working from current state to future state, one of the things we identified in a SWOT is the ability to be reactive. As the business changes, so should the processes and improvements the EA efforts are overseeing. Thus, we want to track improvements from the viewpoint of the project manager.
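The goal-question-metric hierarchy for this example can also be written out in code. This is only a sketch; the class names and the specific questions and metrics below are my own illustrative assumptions about how the turnaround-time goal might decompose, not an official GQM notation:

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    objective: bool  # True if independent of viewpoint (GQM's objective/subjective split)

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    purpose: str     # purpose of measurement
    obj: str         # object to be measured
    issue: str       # issue to be measured
    viewpoint: str   # viewpoint from which the measure is taken
    questions: list = field(default_factory=list)

goal = Goal(
    purpose="Improve",
    obj="change request process",
    issue="turnaround time",
    viewpoint="project manager",
    questions=[
        Question(
            text="What is the current average turnaround time for a change request?",
            metrics=[Metric("average days from submission to closure", objective=True)],
        ),
        Question(
            text="Is turnaround time improving release over release?",
            metrics=[Metric("trend of average turnaround per release", objective=True),
                     Metric("PM-perceived responsiveness rating", objective=False)],
        ),
    ],
)

# Walking the hierarchy shows how every metric traces back to the stated goal:
print(sum(len(q.metrics) for q in goal.questions))  # 3
```

Because every metric hangs off a question, and every question off the goal, nothing gets measured "just because we can", which is exactly the purposeful measurement Basili describes.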




This provides a very measurable approach to the metric, one that everyone can agree on, measure, and report as it changes over time.

Let me know what you think or if I am off base here. I would like to hear how others have built a consensus in their efforts around performance and tracking EA efforts.

Thanks again for reading!

Sunday, April 10, 2016

EA 873 Week 12 The Effectiveness Challenge

As always, hello! Welcome back, if you are returning! Thanks for reading, if this is your first time!

This week our readings focused on governance. Governing, according to Gartner, “refers to the processes and organizational structure, along with their associated input and decision rights, that guide desirable enterprise behavior." The reading went on to discuss touchpoints between EA and governance processes. Artifacts, and the management of artifacts, become a critical part of the EA governance process. As I have been doing the last couple of weeks, I wanted to look back into my personal experience for some context and insight related to this week's EA topic.

Artifacts and governance remind me of a concept called Reuse-driven Software Engineering.  I have used this process as part of Center of Excellence (COE) efforts around Business Intelligence and Analytics.

Reuse can be thought of as a set of actions that maximizes the use of assets, processes, and artifacts from one line of business (LOB) for the development and deployment of solutions in another LOB. This approach seems simple conceptually, but it can be very complex and challenging to implement. This implementation is where COE and EA governance efforts come into play. Organizational models, new architectures, and processes will need to be introduced and managed/governed if reuse is to be effective. When implemented effectively, a continuous-improvement-oriented solution delivery model can be deployed, as depicted below.
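One small way to picture the governance side of reuse is a shared asset catalog that the COE manages. This is a hypothetical sketch; the class, its methods, and the asset names are all assumptions I am making for illustration, but it shows why governance matters: someone has to own publishing, and reuse has to be visible to be managed:

```python
class AssetCatalog:
    """A hypothetical COE-governed catalog of reusable assets across LOBs."""

    def __init__(self):
        # name -> record of the artifact, its originating LOB, and how often it is reused
        self._assets = {}

    def publish(self, name, artifact, source_lob):
        """An LOB (via the COE) publishes a governed asset for others to reuse."""
        self._assets[name] = {"artifact": artifact,
                              "source_lob": source_lob,
                              "reuse_count": 0}

    def reuse(self, name, consuming_lob):
        """Another LOB pulls the asset; tracking reuse supports governance reporting."""
        entry = self._assets[name]
        entry["reuse_count"] += 1
        return entry["artifact"]

catalog = AssetCatalog()
catalog.publish("customer-churn-model", artifact="model-v1", source_lob="Retail")
catalog.reuse("customer-churn-model", consuming_lob="Commercial")
catalog.reuse("customer-churn-model", consuming_lob="Wealth")
print(catalog._assets["customer-churn-model"]["reuse_count"])  # 2
```

The reuse count is the kind of simple, governable signal a COE can report on: an asset published once and reused twice has already paid for itself.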




This delivery model can be characterized by being:

  • Customer-driven: Collaboratively developed with the business and aligned with IT.
  • High value: Provides impact to the business that is aligned with goals and objectives.


In a COE scenario, this approach would utilize the prioritized business models as the primary means of governing and aligning capabilities. These capabilities, in turn, drive the solutions required by the business units. In my opinion, this is how the EA team should approach governance of artifacts in the organization.


Here is a link related to this subject; while dated (2009), it suggests that there may be a connection between EA and reuse.

http://www.architectingforum.org/whitepapers/SAF_WhitePaper_2009_9.pdf

Do you also see the COE management model as viable for EA artifact governance? As always I welcome your thoughts and comments!


Thanks again for reading. Again, I hope this starts to tie my past posts together into a more useful framework.




Sunday, April 3, 2016

EA 873 Week 11 The Foundation for Execution

Hello and welcome back, if you are returning! Thanks for reading, if this is your first time!

A lot of this semester's focus has been on the different aspects of an EA program or initiative. In this blog, I am going to pull together some elements and discuss assessing an organization's foundation for execution. A lot of these thoughts are a result of reading the book assigned to us for EA 872,

"Enterprise Architecture As Strategy" by Ross, Weill, and Robertson. 

The book discusses many warning signs of a company that doesn't have a foundation to support its strategy.  A foundation is critical to developing an increased effectiveness of Marketing, Sales & Service Operations from the perspective of the enterprise. It should provide for increased standardization within and across all operational centers from the standpoint of the leadership team. It must also improve the integration of strategic objectives from the Stakeholders’ perspective.

I've attempted to show this by utilizing one of the techniques discussed in my week five post, SWOT with storyboards:

http://joec3s.blogspot.com/2016/02/ea-872-week-5-getting-on-same-page.html

The SWOT analysis will show that enterprise commitment to EA is critical for successful deployment and acceptance, thus creating a framework for execution. We can then follow the issues that arise when looking to establish enterprise-level programs like EA. To recap, the colors used below map as follows:

  • Strengths = GREEN
  • Weaknesses = YELLOW
  • Opportunities = BLUE
  • Threats = RED


EA Programs start with enterprise commitment

This STRENGTH MUST start from the top, with executive ownership and leadership.



EA Programs Promise Business Value Realization

This leads to OPPORTUNITIES that can remove silos (organization, data, and processes), which allows one to stay a step ahead and improve competitive positioning and customer service.


Obstacles to Success Exist within Enterprises

Major WEAKNESSES, such as too much data/information, or the wrong type of information needed to respond quickly and efficiently in a rapidly changing business environment, typically sabotage these opportunities.


The Price of Failure is High

The price of failure is apparent. These THREATS lead to loss of revenue, decreased profitability, and the eventual loss of customers.


You can overcome

Determining ‘the Right’ foundation for your EA program can mitigate these risks and help it succeed.



However, to do so, the foundation must take into account PROCESS, ORGANIZATION and ARCHITECTURE.


Thanks again for reading. Again, I hope this starts to tie together my past posts into a more useful framework.