
A DD4T.net Implementation - Resolving Partial Models

In the previous post, The Model Factory, I mentioned a couple of methods on the ModelFactory class that can be used to resolve partial models.

But first, what are partial models? The way DD4T publishes linked Components depends on the LinkLevels parameter on the TBB Generate Dynamic Component. This parameter indicates how many levels of Component Links to follow and to generate additional models for. Its default value of 1 means that only one level of Component Links is followed, starting from the Component currently being published; each linked Component is published with its identifying metadata only, while its Fields and MetadataFields collections remain empty. These are partial models, because they contain only some of the information about the Component.
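As a rough sketch, a partial model can be recognized by its empty field collections. The IsPartial helper below is hypothetical (not part of DD4T), and it assumes the model exposes Fields and MetadataFields; note that a Component that genuinely has no fields would also match:

    // Hypothetical helper: a partial model still carries identifying
    // metadata (Id, Title), but its field collections were never populated
    public static bool IsPartial(IComponent component)
    {
        return component.Fields.Count == 0 && component.MetadataFields.Count == 0;
    }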

When should we use higher values for LinkLevels, and when not? If we increase LinkLevels, the original Component model will contain models for all linked Components. This is very convenient when hopping from one Component to another, but it creates a few problems:
  • the size of the original model grows exponentially
  • consistency across models is not maintained (the same model can be reached via different routes, and one must republish all linking Components in order to keep the linked Component models consistent)
The other option is to keep LinkLevels at its default value of 1. This means, on the other hand, that in order to load a linked Component's full model we must:
  • publish it as a Dynamic Component Presentation (with its own dynamic CT)
  • load it on its own using the ComponentFactory and/or ModelFactory
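Once the linked Component is published as a Dynamic Component Presentation, its full model can be loaded on its own. A minimal sketch, assuming the GetModel method from the previous post, a hypothetical IAsset model class, and an example TcmUri:

    IModelFactory modelFactory = DependencyResolver.Current.GetService<IModelFactory>();

    // Hypothetical TcmUri and view name; GetModel loads the full model of
    // the Component published as a Dynamic Component Presentation
    IAsset asset = modelFactory.GetModel<IAsset>("tcm:5-123", "Asset");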
Of course, different situations demand different solutions. I would choose LinkLevels = 1 for content models that are highly linked and where the same Component is linked to via multiple routes. This is also the situation I present in this post regarding the ResolveModel methods in the ModelFactory.

The ResolveModel methods are to be used when we need the complete model of a linked Component but initially have only a partial model. The idea is to pass the partial model along with a view name or template TcmUri, and the ModelFactory returns a fully populated, strongly typed model.

The ResolveModel methods come in two flavours: one takes a single model as parameter and resolves it; the other takes an IList of models and resolves each partial model in it.

    public T ResolveModel<T>(T model, string viewOrTemplateUri) where T : ModelBase
    {
        // Reload the full model by its identifier, using the given view name or template TcmUri
        return GetModel<T>(model.Id, viewOrTemplateUri);
    }

    public IList<T> ResolveModels<T>(IList<T> models, string viewOrTemplateUri) where T : ModelBase
    {
        // Resolve each partial model in the collection; models that cannot be resolved are dropped
        return models.Select(x => ResolveModel<T>(x, viewOrTemplateUri)).Where(x => x != null).ToList();
    }

The implementations are quite trivial, as seen above, but they make life a lot easier when we need to resolve a single partial model or a collection of them:

    IModelFactory modelFactory = DependencyResolver.Current.GetService<IModelFactory>();
    device.Metadata.RelatedAssets = modelFactory.ResolveModels(device.Metadata.RelatedAssets, "Asset");

In the example above, we have a collection of partially resolved Asset models linked from the metadata of a Device model.

First, the code retrieves the IModelFactory implementation from the dependency resolver.

Next, it calls the ResolveModels method, passing in the collection of partially resolved Asset models and a view name.

Finally, it replaces the collection of partially resolved models on the Device metadata with the collection of fully resolved Asset models.
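The single-model flavour works the same way. As a sketch, assuming a hypothetical MainAsset property on the Device metadata:

    // Resolve one partial model into its fully populated counterpart;
    // behaviour when the item cannot be resolved follows GetModel
    device.Metadata.MainAsset = modelFactory.ResolveModel(device.Metadata.MainAsset, "Asset");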


