
Experience Manager in DD4T with Service-Oriented Architecture Providers

This post describes the architecture for Tridion's Experience Manager (XPM) when used with an MVC framework such as DD4T in a Service-Oriented Architecture. This architecture implies that the website is completely independent of the Tridion Content Delivery APIs. The only place where the CD APIs are used is the PROVIDER SERVICES LAYER, as shown in the following diagram.
I am going to focus on this diagram to explain the basics of XPM and the Session Preview mechanism in a DD4T setup with SOA providers.

Publishing Flow

This flow is a standard Content Delivery DB publishing flow: TRIDION CONTENT MANAGER publishes Pages, Components and Binaries to a DEPLOYER that writes everything in the CONTENT DELIVERY DATA STORE.

It is very important to emphasize here that all published item types go to the CD database and that nothing is written to a file system.





Content Delivery Flow

This flow represents the normal delivery of pages and binaries from the web-application DD4T WEBSITE to the visitor's browser.

The visitor's request to DD4T WEBSITE goes through the MVC controller into a model factory, which invokes a provider on the PROVIDER SERVICES LAYER using a RESTful call.

The provider reads the content from the CONTENT DELIVERY DATA STORE and returns it as a string (or some serialized format) to the factory on the DD4T WEBSITE.

Here the response is deserialized into an MVC model (e.g. a Page meta object). The model is then forwarded to a View, which renders it and sends the response to the visitor's browser.
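The factory-to-provider call above can be sketched in a few lines of Java. This is a minimal illustration, not the actual DD4T API: the class name `PageFactory`, the endpoint path `/page/{publicationId}?url=...`, and the base URL are all assumptions made for the example.

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of a DD4T-style model factory calling the
// Provider Services Layer over REST. Endpoint layout is assumed.
class PageFactory {
    private final String providerBaseUrl;

    PageFactory(String providerBaseUrl) {
        this.providerBaseUrl = providerBaseUrl;
    }

    // Build the RESTful request for a page model, keyed by
    // publication id and page URL (URL-encoded as a query parameter).
    HttpRequest buildPageRequest(int publicationId, String pageUrl) {
        URI uri = URI.create(providerBaseUrl + "/page/" + publicationId
                + "?url=" + URLEncoder.encode(pageUrl, StandardCharsets.UTF_8));
        return HttpRequest.newBuilder(uri).GET().build();
    }

    // Fetch the serialized page model (e.g. a JSON string) from the
    // provider; the caller deserializes it into an MVC model object.
    String getPageModel(int publicationId, String pageUrl) throws Exception {
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(buildPageRequest(publicationId, pageUrl),
                      HttpResponse.BodyHandlers.ofString());
        return response.body();
    }
}
```

The point of the sketch is the separation of concerns: the web application only ever sees a serialized string coming back over HTTP, never a Tridion API object.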

Experience Manager Flow

This flow describes the high-level actions that occur when a Page or Dynamic Component Presentation is edited through XPM. It follows the Content Delivery Flow and starts when an editor enables XPM on a Page in the Staging DD4T WEBSITE.

TRIDION CONTENT MANAGER loads the page in an iframe through its XPM module, which runs as an extension on the CME website.

When the editor makes changes to the content (adding, changing or reordering items), the changes are saved immediately to their respective Tridion items on the TRIDION CONTENT MANAGER. However, the changed items are not published immediately -- they are only published when the editor clicks the Finish Editing button in the XPM toolbar. Instead, the changes are 'fast-track published' (i.e. posted) to an ODATA WEBSERVICE, which writes them into a SESSION PREVIEW DATA STORE.

Session Preview Flow

This flow follows the Experience Manager Flow and consists of the high-level actions that occur when the editor clicks the 'Update Preview' toolbar button in XPM. This update is required when the content the editor is viewing in an XPM session has been modified but not published: XPM detects that the last-modified timestamp of the content is newer than the last-published timestamp, and concludes that an update is needed.

Instead of publishing the modified items, the editor just clicks 'Update Preview'.

The XPM module on the TRIDION CONTENT MANAGER sets a cookie on the edited page containing the ID of the current XPM session (i.e. preview-session-token) and then triggers a reload of the edited page.

The Content Delivery Flow then occurs, with the DD4T WEBSITE invoking the PROVIDER SERVICES LAYER to request a serialized model. The preview-session-token cookie must be present in this request.

The provider uses the preview session token to look up the content in either the CONTENT DELIVERY DATA STORE or the SESSION PREVIEW DATA STORE. If modified content is available in the Session Preview DB, it is returned; otherwise, the published content from the Content Delivery DB is returned.
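The lookup order described above can be illustrated with a small in-memory model. This is only a sketch of the logic, not the Tridion implementation: the class `ContentLookup`, the two maps standing in for the data stores, and the composite key are all assumptions made for the example.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Illustrative model of the provider-side lookup: session-preview
// content, when present for the given preview token, wins over
// published content from the Content Delivery data store.
class ContentLookup {
    // pageUrl -> published, serialized model
    private final Map<String, String> contentDeliveryStore = new HashMap<>();
    // previewToken + "|" + pageUrl -> fast-track published model
    private final Map<String, String> sessionPreviewStore = new HashMap<>();

    void publish(String pageUrl, String model) {
        contentDeliveryStore.put(pageUrl, model);
    }

    void fastTrackPublish(String previewToken, String pageUrl, String model) {
        sessionPreviewStore.put(previewToken + "|" + pageUrl, model);
    }

    // Preview content first, then fall back to published content.
    String lookup(Optional<String> previewToken, String pageUrl) {
        if (previewToken.isPresent()) {
            String preview = sessionPreviewStore.get(previewToken.get() + "|" + pageUrl);
            if (preview != null) {
                return preview;
            }
        }
        return contentDeliveryStore.get(pageUrl);
    }
}
```

Note that keying the preview store by token keeps one editor's unpublished changes invisible to every other session, which is exactly why the token must travel with each request.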

Problem with Session Preview Flow

Because the current approach uses MVC with SOA providers, the DD4T WEBSITE has no knowledge of Tridion; no Tridion software runs on it. In a non-SOA approach, we would run the Ambient Data Framework (ADF) on the web application, and ADF would provide the session preview token to the underlying CD API. Based on this token, the API 'knows' which content to look for.

XPM sets the cookie with the preview session token on the edited page, but there is no software module that uses it or makes it available to the PROVIDER SERVICES LAYER.

The solution is for us to write the logic that attaches the preview session token to the RESTful request to the services layer. In other words, the factories are responsible for attaching a cookie containing the preview session token to their RESTful requests.

Once this token is in the RESTful request, the ADF module running on the PROVIDER SERVICES LAYER is able to intercept it and use it during the content lookup.
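The custom logic amounts to copying one cookie from the incoming request onto the outgoing RESTful request. Below is a minimal sketch of that forwarding step; the class name `PreviewTokenForwarder` and the raw Cookie-header parsing are assumptions made for the example (in a real DD4T web application you would read the cookie from the servlet request or HttpContext instead).

```java
import java.util.Optional;

// Sketch of the small piece of custom code described above: extract
// the preview-session-token cookie from the visitor's request and
// rebuild it as a Cookie header for the factory's RESTful request,
// so the ADF on the provider side can intercept it.
final class PreviewTokenForwarder {
    static final String TOKEN_COOKIE = "preview-session-token";

    // Extract the preview session token from a raw Cookie header, if present.
    static Optional<String> extractToken(String cookieHeader) {
        if (cookieHeader == null) return Optional.empty();
        for (String pair : cookieHeader.split(";")) {
            String[] kv = pair.trim().split("=", 2);
            if (kv.length == 2 && kv[0].equals(TOKEN_COOKIE)) {
                return Optional.of(kv[1]);
            }
        }
        return Optional.empty();
    }

    // Build the Cookie header value to set on the outgoing RESTful request.
    static Optional<String> outgoingCookieHeader(String incomingCookieHeader) {
        return extractToken(incomingCookieHeader).map(t -> TOKEN_COOKIE + "=" + t);
    }
}
```

When no token is present (a normal visitor, not an XPM session), nothing is forwarded and the provider simply serves published content.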

Conclusion

XPM with DD4T using SOA providers is possible and it works, but not out of the box: a small amount of custom code is needed to send the preview session token to the providers.

