
Create Multimedia Component using Core Service Java Client

I recently stumbled upon the following question on TREX: http://tridion.stackexchange.com/questions/1379/not-able-to-upload-binary-images-using-core-service2011-api-from-java-client/

I had never actually attempted to upload a Binary and create a Multimedia Component using the Core Service, so I was intrigued to find out how to do it.

The solution is pretty straightforward, but it consists of two steps:
  • First, upload the Binary to the Content Manager server (the TCM);
  • Second, create the Multimedia Component referencing the path on the TCM where the binary was uploaded.
The sample code below accomplishes the first step:

private static File uploadFile(File file) {
    try {
        // Read the local file fully into memory
        byte[] fileData = new byte[(int) file.length()];
        DataInputStream dis = new DataInputStream(new FileInputStream(file));
        try {
            dis.readFully(fileData);
        } finally {
            dis.close();
        }

        // Send the bytes to the Core Service StreamUpload endpoint; it returns
        // the temporary path of the uploaded binary on the TCM server
        IStreamUpload clientUpload =
                CoreServiceFactory.getStreamUploadBasicHttpClientClient();
        String uploadedPath = clientUpload.uploadBinaryByteArray(file.getName(), fileData);
        return new File(uploadedPath);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}

The method takes a local file (i.e. local on the client machine) and uploads it to the TCM using the Core Service StreamUploadBasicHttpClient endpoint. It returns a File object representing the location of the uploaded binary on the TCM (e.g. C:\Windows\TEMP\tmp2432.jpg).

Once the binary has been uploaded to the remote TCM, we can create the Multimedia Component:

private static void createMultimediaComponent() throws Exception {
    ICoreService client = CoreServiceFactory.getBasicHttpClient();

    // Start from the default Component data in the target Folder (tcm:1-1-2)
    ReadOptions readOptions = new ReadOptions();
    ComponentData componentData =
            (ComponentData) client.getDefaultData(ItemType.COMPONENT, "tcm:1-1-2", readOptions);
    componentData.setTitle("MMC " + System.currentTimeMillis());

    // Link the Multimedia Schema (tcm:1-26-8)
    LinkToSchemaData linkToSchema = new LinkToSchemaData();
    linkToSchema.setIdRef("tcm:1-26-8");
    componentData.setSchema(linkToSchema);

    // Upload the local binary and reference its temporary path on the TCM
    File localFile = new File("C:\\Beagle.jpg");
    File remoteFile = uploadFile(localFile);
    BinaryContentData binaryContentData = new BinaryContentData();
    binaryContentData.setFilename(localFile.getName());
    binaryContentData.setUploadFromFile(remoteFile.getAbsolutePath());

    // Specify the Multimedia Type (hard-coded TCM URI here)
    LinkToMultimediaTypeData linkToMultimediaType = new LinkToMultimediaTypeData();
    linkToMultimediaType.setIdRef("tcm:0-2-65544");
    binaryContentData.setMultimediaType(linkToMultimediaType);
    componentData.setBinaryContent(binaryContentData);

    // Save the new Multimedia Component and check it in
    componentData = (ComponentData) client.save(componentData, readOptions);
    client.checkIn(componentData.getId(), true, null, readOptions);
}

The code is pretty straightforward:
  • the initial ComponentData is created using client.getDefaultData();
  • the Multimedia Schema is linked from the ComponentData;
  • the BinaryContentData is created using the remote binary file;
  • the MultimediaType is identified and specified. This is probably the ugliest part of the entire approach. The logic could be made more generic by first reading all available Multimedia Types, then identifying the one we need by matching on the binary's MIME type (see the sketch after this list);
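
As a rough sketch of that improvement (not part of the code above), the lookup could read all Multimedia Types with getSystemWideList and match on MIME type. The names MultimediaTypesFilterData, getIdentifiableObjectData and getMimeType are assumptions based on typical WSDL-generated Java stubs and may differ in your client:

private static String findMultimediaTypeId(ICoreService client, String mimeType) {
    // Hypothetical sketch: resolve the Multimedia Type TCM URI by MIME type
    // instead of hard-coding "tcm:0-2-65544". Stub names are assumptions.
    MultimediaTypesFilterData filter = new MultimediaTypesFilterData();
    for (IdentifiableObjectData item :
            client.getSystemWideList(filter).getIdentifiableObjectData()) {
        MultimediaTypeData multimediaType = (MultimediaTypeData) item;
        if (mimeType.equalsIgnoreCase(multimediaType.getMimeType())) {
            return multimediaType.getId();
        }
    }
    return null; // no Multimedia Type registered for this MIME type
}

The returned TCM URI could then be passed to linkToMultimediaType.setIdRef(...) instead of the hard-coded value, and the MIME type of the local file could be guessed with, for example, java.nio.file.Files.probeContentType(localFile.toPath()).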

For more information about the CoreServiceFactory object, have a look at my previous posts relating to Java Core Service clients.

Note that the default reader quota on the Core Service WCF endpoint is 16384 bytes, so when uploading a larger binary I got the following error message:

com.sun.xml.ws.fault.ServerSOAPFaultException: Client received SOAP Fault from server: The formatter threw an exception while trying to deserialize the message: Error in deserializing body of request message for operation 'UploadBinaryByteArray'. The maximum array length quota (16384) has been exceeded while reading XML data. This quota may be increased by changing the MaxArrayLength property on the XmlDictionaryReaderQuotas object used when creating the XML reader. Line 1, position 615.

The solution was to increase the MaxArrayLength reader quota on the WCF service by editing the file [TRIDION_HOME]\webservices\Web.config and inserting the following line under system.serviceModel / bindings / basicHttpBinding / binding name="StreamUpload_basicHttpBinding":

<readerQuotas maxArrayLength="999999999" />
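
For placement only, the element ends up nested roughly as sketched below; the existing attributes on the binding element in Web.config stay as they are:

<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <binding name="StreamUpload_basicHttpBinding">
        <!-- increased reader quota for large binary uploads -->
        <readerQuotas maxArrayLength="999999999" />
      </binding>
    </basicHttpBinding>
  </bindings>
</system.serviceModel>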


Comments

Jeremy Rambo said…
Did you ever figure out a way to overcome the quota issue without changing the WCF service? I.e. can I configure the Java client in any way that allows me to set the maxArrayLength?
D .Deepak said…
Hi, how can we update an existing binary/multimedia component?
