

Showing posts from June, 2014

Experience Manager in DD4T with Service-Oriented Architecture Providers

This post describes the architecture for Tridion's Experience Manager (XPM) when used with an MVC framework such as DD4T in a Service-Oriented Architecture. This architecture implies that the website is completely independent of the Tridion Content Delivery APIs. The only place where the CD APIs are used is the Providers' Services Layer, as described in the following diagram. I am going to focus on this diagram to explain the basics of XPM and the Session Preview mechanism in a DD4T-with-SOA-providers setup. Publishing Flow: This is a standard Content Delivery DB publishing flow: the TRIDION CONTENT MANAGER publishes Pages, Components and Binaries to a DEPLOYER that writes everything to the CONTENT DELIVERY DATA STORE. It is very important to emphasize here that all publishing types go to the CD DB and that nothing is written to a file-system. Content Delivery Flow: This flow represents the normal delivery of pages and binaries from the web-application DD4
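The separation described above can be sketched as a minimal interface boundary. All names here are hypothetical illustrations: the web application codes only against a provider interface, and only the services layer behind it would touch the Tridion CD API. An in-memory map stands in for the CD data store.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: the web application depends only on this interface;
// in a real setup the implementation lives in the services layer and uses
// the Tridion CD API, so the website never touches com.tridion.* classes.
interface PageProvider {
    String getPageContent(String url);
}

// In-memory stand-in for the service-backed implementation.
class InMemoryPageProvider implements PageProvider {
    private final Map<String, String> store = new HashMap<>();

    // Mimics the Deployer writing published content to the CD data store.
    void publish(String url, String content) {
        store.put(url, content);
    }

    @Override
    public String getPageContent(String url) {
        return store.get(url);
    }
}
```

The point of the sketch is the dependency direction: the website can be compiled and tested against `PageProvider` alone.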

OData Page Provider for DD4T Java

This little project has been on my mind for a while now: write a set of DD4T providers for OData (a.k.a. the Content Delivery Web Service). The approach is to remove all mention of the CD API from the DD4T application itself and restrict Tridion interaction to the OData service. The DD4T application would use an OData client to talk to the service. Since communication with OData goes over HTTP, the number of queries to the service must be kept to a minimum, so the DD4T PageFactory will make heavy use of caching. I implemented an EHCache provider for DD4T, but that's outside the scope of this blog post. So finally, this weekend I set out to write these providers. The first challenge quickly showed up: the CD API is used in many DD4T classes, and even where its functionality is not used directly, DD4T still makes extensive use of CD API classes and interfaces. It took me a couple of hours to remove the references to com.tridion.* packages. During this process, t
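The caching rationale above can be illustrated with a small sketch. All names here are hypothetical; a real setup would use an EHCache-backed DD4T provider, and the `ConcurrentHashMap` below is only a stand-in to show why repeated OData round trips over HTTP are avoided.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Function;

// Sketch only: caches page content by URL so the expensive remote fetch
// (e.g. an OData HTTP call) happens at most once per URL.
class CachingPageSource {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Function<String, String> remoteFetch;
    final AtomicInteger remoteCalls = new AtomicInteger();

    CachingPageSource(Function<String, String> remoteFetch) {
        this.remoteFetch = remoteFetch;
    }

    String getPage(String url) {
        // computeIfAbsent only invokes the fetch on a cache miss.
        return cache.computeIfAbsent(url, u -> {
            remoteCalls.incrementAndGet();
            return remoteFetch.apply(u);
        });
    }
}
```

A second request for the same URL is served from the cache without another HTTP round trip.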

Retrieve Classified Items for Multiple Keywords

Using the standard Content Delivery API, it is possible to retrieve related content (i.e. items that are classified against Keywords) for only one Keyword at a time. If you find yourself with a requirement to read classified items for all Keywords in a Taxonomy, you'll quickly realize that this approach is a performance killer. To better express what I'm looking for, consider the following: Keyword A -- classified with Component 1, Component 2; Keyword B -- classified with Component 3, Page 1; Keyword C -- classified with Component 1, Page 2. I want to read the classified items for all Keywords in a Taxonomy and still maintain the 'classified' relationship between each Keyword and its corresponding list of Components & Pages. The way I did it was to write my own Spring/Hibernate query and piggyback on the existing DAO objects and methods available in the CD Storage API. The key is the bean, which basically maps the re
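The regrouping step can be sketched as follows. This is a hypothetical illustration, not the actual Hibernate mapping: assume the custom query returns flat (keyword, classified item) pairs, and we rebuild the keyword-to-items relationship in memory after a single round trip for the whole Taxonomy.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: rows are (keyword, item) pairs as a flat result set;
// regroup them so each Keyword keeps its list of classified items.
class ClassifiedItemGrouper {
    static Map<String, List<String>> groupByKeyword(List<String[]> rows) {
        Map<String, List<String>> byKeyword = new LinkedHashMap<>();
        for (String[] row : rows) {
            byKeyword.computeIfAbsent(row[0], k -> new ArrayList<>()).add(row[1]);
        }
        return byKeyword;
    }
}
```

One query plus one in-memory pass replaces one query per Keyword.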

Yet Another Event System - Revisited

As promised in my previous post Yet Another Event System?, it is time for a follow-up with the code and examples. The code has been added to Google Code and is available for free at YAES source code. You can simply download the entire solution and start coding. If you're only interested in the main functionality class, go directly to ConfigurationManager.cs. Installation: This is a normal Event System installation process: copy the compiled Event System DLL to a location of your choosing; reference the DLL from file [Tridion_Home]\conf\Tridion.ContentManager.config, in section configuration / extensions / add, attribute assemblyFileName; copy a .config file to the same directory where your Event System DLL is located, and name it the same as your DLL, but with extension .config; restart the Tridion processes. To know exactly which processes to restart, have a look at my earlier post Debugging a Tridion 2011 Event System. Configuration: The .config file alo
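Based on the registration step described above (section configuration / extensions / add, attribute assemblyFileName), the relevant fragment of Tridion.ContentManager.config might look roughly like this; the DLL path is a hypothetical example and the real file contains many more settings:

```xml
<!-- Sketch only: add the <add> element under configuration / extensions,
     pointing assemblyFileName at your compiled Event System DLL. -->
<configuration>
  <extensions>
    <add assemblyFileName="C:\Tridion\custom\YaesEvents.dll" />
  </extensions>
</configuration>
```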

My Own 'Core Service'-based Purge Publishing Queue

After blogging about my beautiful script for purging old Publish Transactions from the Publishing Queue, it turned out the PurgeQueue.exe tool provided by SDL with SDL Tridion 2011 simply did not work. More exactly, I was attempting to delete Publish Transactions with any status (Success or Failed) from the Publishing Queue that had a date earlier than one year ago today. PurgeQueue.exe runs and displays a message that the Publishing Queue has been purged; in fact, the Publish Transactions were not deleted and still appeared in the Publishing Queue. So it was time to write my own custom PurgeTool, using the Core Service .NET API. The requirement stays the same -- delete Publish Transactions that are at least a year old. I started with the programmatic instantiation of the Core Service client, as described in my earlier post Core Service Client Sample Code. Once I had a SessionAwareCoreServiceClient client object, the rest of the code is trivial: PublishTransactionsFilter
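The date-threshold rule at the heart of the tool can be sketched independently of the Core Service API. Shown here in plain Java as an illustration only; the real tool uses the .NET client with a PublishTransactionsFilter, and all names below are hypothetical.

```java
import java.time.LocalDate;
import java.util.List;
import java.util.stream.Collectors;

// Illustrates only the selection rule -- 'at least a year old' -- not the
// actual Core Service calls that delete the transactions.
class PurgeSelector {
    static List<LocalDate> selectPurgable(List<LocalDate> transactionDates, LocalDate today) {
        LocalDate cutoff = today.minusYears(1);
        // Keep anything dated on or before the cutoff.
        return transactionDates.stream()
                .filter(d -> !d.isAfter(cutoff))
                .collect(Collectors.toList());
    }
}
```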