Creating an Architecture Review
Architecting even the simplest of systems nowadays can be a significant and complex undertaking, so a review of the current or proposed architecture is generally warranted. I’ve often been asked “What does a review of this type consist of?” and thought it would be beneficial to share some key areas that I normally focus on.
Architecture Fundamentals
The architecture fundamentals dictate how maintainable and extensible a system is, and provide an indication of how well it integrates with other systems. Emphasis should be placed on determining whether the architecture is layered, allowing for separation of responsibilities, and whether it uses established design patterns to solve common problems.
Performance and Scalability
These are area’s to assess whether the architecture is designed for performance and can be scaled out. For instance, does the architecture allow the system to be scaled accordingly (i.e. introduction of additional hardware), response times meet SLA agreements, etc.
Security
There are three layers of security: network, host and application. Using those as a basis, how is security incorporated into the system? Threat modeling, penetration and security testing should also be included to verify how secure the system is and what gaps, if any, exist. Although testing is by its very nature specific to development and may be out of scope for this type of review, I’d argue it should be considered when architecting a solution. A byproduct of including these types of tests is follow-ups for code reviews, which can lead to a more robust system.
Instrumentation and Monitoring
Instrumentation allows operations to monitor a system in order to keep it running smoothly in production. Custom and/or standard WMI events (such as performance counters) should be implemented for monitoring performance and stability.
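WMI performance counters are Windows-specific, but the underlying idea is portable: the system exposes named counters that operations can read. The following minimal, thread-safe counter in Python is a stand-in sketch for that pattern, not an implementation of the WMI API.

```python
import threading

class PerfCounter:
    """Minimal in-process counter; a stand-in for a WMI performance counter."""
    def __init__(self, name):
        self.name = name
        self._value = 0
        self._lock = threading.Lock()  # counters are updated from many threads

    def increment(self, by=1):
        with self._lock:
            self._value += by

    @property
    def value(self):
        with self._lock:
            return self._value

# Hypothetical counter tracking handled requests.
requests = PerfCounter("requests_handled")
for _ in range(3):
    requests.increment()
print(requests.name, requests.value)  # -> requests_handled 3
```

In a real deployment the counter values would be published to the platform's monitoring facility (WMI, Prometheus, etc.) rather than kept in process memory.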
Logging is the act of writing information about system events to some data repository. Logging patterns should therefore be established and well documented. This is not the same as auditing.
Auditing is particularly important to many institutions as they may be subject to regulatory requirements. Auditing makes it possible to track all user activity (who did what, and when) and should almost always be part of the architecture.
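The "who did what, when" requirement maps naturally onto an append-only record. This sketch captures the shape of an audit entry; the in-memory list stands in for what would be a durable, tamper-evident store, and the user/action names are made up.

```python
import datetime

AUDIT_LOG = []  # stand-in for a durable, append-only audit store

def audit(user, action, target):
    """Record who did what, to what, and when."""
    AUDIT_LOG.append({
        "user": user,
        "action": action,
        "target": target,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

audit("jsmith", "UPDATE", "account:42")
audit("jsmith", "DELETE", "account:42")
print(len(AUDIT_LOG))  # -> 2
```

Note the difference from logging above: audit records are a regulatory artifact that must never be rotated away or filtered by severity.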
When errors occur, it is critical that they are handled in a consistent and robust manner. An exception handling approach should therefore be defined, documented and consistently applied.
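One common way to make exception handling consistent is to centralize the policy instead of scattering try/except blocks. The decorator below sketches one such policy (log with context, then re-raise); the policy itself is an assumption for illustration.

```python
import functools
import logging

log = logging.getLogger("app")

def handled(func):
    """Apply one consistent policy: log the failure with context, then re-raise
    so a single top-level handler can decide how to respond."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            log.exception("unhandled error in %s", func.__name__)
            raise
    return wrapper

@handled
def divide(a, b):
    return a / b

print(divide(10, 2))  # -> 5.0
```

Because every function opts into the same decorator, a reviewer can verify the approach is "consistently applied" by inspection rather than by reading each handler.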
Data Access
Data access is the cause of many performance problems, and close attention should be paid to this layer. For many organizations, the security and reliability of data access are also paramount. Consequently, the data access layer should use standard technologies, frameworks and APIs.
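A concrete marker of a standard, secure data access layer is the use of parameterized queries through the platform's standard database API. This sketch uses Python's built-in `sqlite3` module with an in-memory database standing in for the real store; the schema is hypothetical.

```python
import sqlite3

# In-memory database standing in for the real data store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

def find_user(name):
    # Parameterized query: the driver escapes the input, which prevents
    # SQL injection and lets the engine reuse the query plan.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (name,)
    ).fetchone()

print(find_user("alice"))  # -> (1, 'alice')
```

A review red flag is the opposite pattern: SQL built by string concatenation, which is both an injection risk and a performance concern.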
Third Party Products
In an age where there’s a component built for almost every need, off-the-shelf products should be used in the architecture where practicable (i.e. rather than developing custom pieces). Two questions worth asking: does the system make good use of off-the-shelf products, and does it use those products in such a way that they can be supported in a production environment?
Future Direction
This section deals with how the system will evolve over time and whether it will be able to take advantage of future advances in technology. Basically, the system should have a clearly defined roadmap covering future major releases.
Other area’s to consider would be diagraming (e.g. conceptual, system context, sequence and activity, etc.), accurate and comprehensible documentation, release/build process and deployment and environment configuration. Once all of these assessment areas have been defined and documented, creating a color coded dashboard using a fixed scale, say, 1-5 that explicitly shows which area's need improvement is extremely helpful.
I’ve only scratched the surface of what an architecture review should encompass, so I highly recommend taking a look at these two predefined methods, which give greater insight into the creation and process:
- Architecture Tradeoff Analysis Method (ATAM)
- Lightweight Architecture Alternative Assessment (LAAAM)
The first method was developed by the Software Engineering Institute at Carnegie Mellon University, and the second is Microsoft’s approach based on the former.