29

I'm trying to find a way to pass objects to the Azure Queue. I couldn't find a way to do this.

As far as I've seen, I can pass a string or a byte array, which is not very convenient for passing objects.

Is there any way to pass custom objects to the Queue?

Thanks!

knightpfhor
Roman

6 Answers

32

You can use the following classes as an example:

[Serializable]
public abstract class BaseMessage
{
    public byte[] ToBinary()
    {
        BinaryFormatter bf = new BinaryFormatter();
        byte[] output = null;
        using (MemoryStream ms = new MemoryStream())
        {
            bf.Serialize(ms, this);
            // ToArray() copies only the bytes actually written;
            // GetBuffer() could return the oversized internal buffer.
            output = ms.ToArray();
        }
        return output;
    }

    public static T FromMessage<T>(CloudQueueMessage m)
    {
        byte[] buffer = m.AsBytes;
        T returnValue = default(T);
        using (MemoryStream ms = new MemoryStream(buffer))
        {
            BinaryFormatter bf = new BinaryFormatter();
            returnValue = (T)bf.Deserialize(ms);
        }
        return returnValue;
    }
}

Then a StdQueue (a strongly typed queue):

public class StdQueue<T> where T : BaseMessage, new()
{
    protected CloudQueue queue;

    public StdQueue(CloudQueue queue)
    {
        this.queue = queue;
    }

    public void AddMessage(T message)
    {
        CloudQueueMessage msg =
            new CloudQueueMessage(message.ToBinary());
        queue.AddMessage(msg);
    }

    public void DeleteMessage(CloudQueueMessage msg)
    {
        queue.DeleteMessage(msg);
    }

    public CloudQueueMessage GetMessage()
    {
        return queue.GetMessage(TimeSpan.FromSeconds(120));
    }
}

Then, all you have to do is inherit from BaseMessage:

[Serializable]
public class ParseTaskMessage : BaseMessage
{
    public Guid TaskId { get; set; }

    public string BlobReferenceString { get; set; }

    public DateTime TimeRequested { get; set; }
}

And make a queue that works with that message:

CloudStorageAccount acc;
if (!CloudStorageAccount.TryParse(connectionString, out acc))
{
    throw new ArgumentOutOfRangeException("connectionString", "Invalid connection string was provided!");
}
CloudQueueClient clnt = acc.CreateCloudQueueClient();
CloudQueue queue = clnt.GetQueueReference(processQueue);
queue.CreateIfNotExist();
this._queue = new StdQueue<ParseTaskMessage>(queue);
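
For completeness, a round trip with the typed queue could look like the sketch below (my illustration rather than part of the original answer; `this._queue` is the `StdQueue<ParseTaskMessage>` field created above, and the property values are placeholders):

// Producer side: enqueue a ParseTaskMessage (placeholder values)
var task = new ParseTaskMessage
{
    TaskId = Guid.NewGuid(),
    BlobReferenceString = "uploads/document-to-parse.xml",
    TimeRequested = DateTime.UtcNow
};
this._queue.AddMessage(task);

// Consumer side: dequeue the raw message, deserialize, process, then delete it
CloudQueueMessage raw = this._queue.GetMessage();
if (raw != null)
{
    ParseTaskMessage received = BaseMessage.FromMessage<ParseTaskMessage>(raw);
    // ... process the task ...
    this._queue.DeleteMessage(raw);
}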

Hope this helps!

astaykov
  • Seems it's the solution most go with :) Thanks! – Roman Dec 18 '11 at 10:54
  • I know :) I use it in production ;) – astaykov Dec 18 '11 at 16:41
  • 9
    Pretty neat solution. But I'd say this violates Single Responsibility Principle: Serialisable POCO objects now take dependency on Azure library. I would not have messages inherit from `BaseMessage`, but rather have `ToBinary()` and `FromMessage()` be private inside of the `StdQueue` class. Objects should not really be responsible for serialisation/deserialisation of themselves. – trailmax Jul 12 '13 at 23:20
  • Hey astaykov, how would you store generic types with this? What if you had ParseTaskMessage, if you had to deserialize it, where would you get the type from? – Kakira Jun 04 '14 at 19:36
  • 1
    newer versions of `Microsoft.WindowsAzure.Storage.Queue` change the ctor `new CloudQueueMessage(byte[] content)` to a static method `CloudQueueMessage.CreateCloudQueueMessageFromByteArray(byte[] content)` – cyptus Mar 06 '19 at 07:02
21

An extension method that uses Newtonsoft.Json and async:

    public static async Task AddMessageAsJsonAsync<T>(this CloudQueue cloudQueue, T objectToAdd)
    {
        var messageAsJson = JsonConvert.SerializeObject(objectToAdd);
        var cloudQueueMessage = new CloudQueueMessage(messageAsJson);
        await cloudQueue.AddMessageAsync(cloudQueueMessage);
    }
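
A matching read-side helper is not part of this answer, but a sketch under the same assumptions (the `CloudQueue`/`CloudQueueMessage` API plus Newtonsoft.Json; the method name `GetMessageAsJsonAsync` is mine) might look like this:

    public static async Task<T> GetMessageAsJsonAsync<T>(this CloudQueue cloudQueue)
    {
        // Dequeue one message and deserialize its JSON body.
        // Note: keep the CloudQueueMessage (or its MessageId/PopReceipt) around if you
        // want to delete it from the queue after processing.
        var cloudQueueMessage = await cloudQueue.GetMessageAsync();
        if (cloudQueueMessage == null) return default(T);
        return JsonConvert.DeserializeObject<T>(cloudQueueMessage.AsString);
    }
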
Akodo_Shado
10

I like this generalization approach, but I don't like having to put the Serializable attribute on all the classes I might want to put in a message, and having to derive them from a base class (I might already have a base class), so I used...

using System;
using System.Text;
using Microsoft.WindowsAzure.Storage.Queue;
using Newtonsoft.Json;

namespace Example.Queue
{
    public static class CloudQueueMessageExtensions
    {
        public static CloudQueueMessage Serialize(Object o)
        {
            var stringBuilder = new StringBuilder();
            stringBuilder.Append(o.GetType().FullName);
            stringBuilder.Append(':');
            stringBuilder.Append(JsonConvert.SerializeObject(o));
            return new CloudQueueMessage(stringBuilder.ToString());
        }

        public static T Deserialize<T>(this CloudQueueMessage m)
        {
            int indexOf = m.AsString.IndexOf(':');

            if (indexOf <= 0)
                throw new Exception(string.Format("Cannot deserialize into object of type {0}", 
                    typeof (T).FullName));

            string typeName = m.AsString.Substring(0, indexOf);
            string json = m.AsString.Substring(indexOf + 1);

            if (typeName != typeof (T).FullName)
            {
                throw new Exception(string.Format("Cannot deserialize object of type {0} into one of type {1}", 
                    typeName,
                    typeof (T).FullName));
            }

            return JsonConvert.DeserializeObject<T>(json);
        }
    }
}

e.g.

var myobject = new MyObject();
_queue.AddMessage(CloudQueueMessageExtensions.Serialize(myobject));

var received = _queue.GetMessage().Deserialize<MyObject>();
acro
  • I like this approach, it is compact :) If using code above, reminder you may still need a reference to the original CloudQueueMessage in order to Delete it from the queue after reading. – Sid James Nov 18 '14 at 10:42
  • 1
    you actually only need the MessageId and PopReceipt properties of the original message. – astaykov Feb 16 '15 at 15:10
1

I liked @Akodo_Shado's approach to serialize with Newtonsoft.Json. I updated it for Azure.Storage.Queues and also added a "Retrieve and Delete" method that deserializes the object from the queue.

public static class CloudQueueExtensions
{
    public static async Task AddMessageAsJsonAsync<T>(this QueueClient queueClient, T objectToAdd) where T : class
    {
        string messageAsJson = JsonConvert.SerializeObject(objectToAdd);
        BinaryData cloudQueueMessage = new BinaryData(messageAsJson);
        await queueClient.SendMessageAsync(cloudQueueMessage);
    }

    public static async Task<T> RetrieveAndDeleteMessageAsObjectAsync<T>(this QueueClient queueClient) where T : class
    {
        QueueMessage[] retrievedMessage = await queueClient.ReceiveMessagesAsync(1);
        if (retrievedMessage.Length == 0) return null;
        string theMessage = retrievedMessage[0].MessageText;
        T instanceOfT = JsonConvert.DeserializeObject<T>(theMessage);
        await queueClient.DeleteMessageAsync(retrievedMessage[0].MessageId, retrievedMessage[0].PopReceipt);

        return instanceOfT;
    }
}

The RetrieveAndDeleteMessageAsObjectAsync method is designed to process one message at a time, but you could obviously rewrite it to deserialize the full array of messages and return an ICollection<T> or similar, as sketched below.
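
Such a batch variant might look roughly like this (my sketch, under the same assumptions as the code above; the method name and the 32-message batch size are illustrative):

public static async Task<ICollection<T>> RetrieveAndDeleteMessagesAsObjectsAsync<T>(this QueueClient queueClient, int maxMessages = 32) where T : class
{
    var results = new List<T>();
    QueueMessage[] retrievedMessages = await queueClient.ReceiveMessagesAsync(maxMessages);
    foreach (QueueMessage message in retrievedMessages)
    {
        // Deserialize each JSON body, then delete the message from the queue.
        results.Add(JsonConvert.DeserializeObject<T>(message.MessageText));
        await queueClient.DeleteMessageAsync(message.MessageId, message.PopReceipt);
    }
    return results;
}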

Superman.Lopez
0

If the storage queue is used with a WebJob or an Azure Function (a quite common scenario), the current Azure SDK allows you to use POCO objects directly; see the examples in the official documentation.

Note: The SDK will automatically use Newtonsoft.Json for serialization/deserialization under the hood.
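
As an illustration only (not from this answer), a queue-triggered function binding a POCO might look like the sketch below, assuming the WebJobs `QueueTrigger` attribute and reusing `ParseTaskMessage` from the accepted answer; the queue name is a placeholder:

public class ParseTaskFunctions
{
    // The SDK deserializes the JSON message body into the POCO before invoking the method.
    public static void ProcessParseTask(
        [QueueTrigger("parse-tasks")] ParseTaskMessage message,
        ILogger log)
    {
        log.LogInformation("Parsing task {TaskId} for blob {Blob}",
            message.TaskId, message.BlobReferenceString);
    }
}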

-12

That is not the right way to do it. Queues are not meant for storing objects; you need to put the object in a blob or a table (serialized). I believe the queue message body has a 64 KB size limit with SDK 1.5 and 8 KB with lower versions. The message body is meant only to carry the crucial data for the workers that pick it up.

Puhek
  • 1
    And why should I put my small enough object in a Blob or Table, when the queue message can carry it? You want me to increase my storage transactions by 50% (from 2 storage transactions - 1 to read the message, 1 to delete it; to 3 - one to read the message, 1 to delete the message + 1 additional to read the table entity or blob)?? And blobs IMHO are for storing files, not serialized objects. Plus you also have the 64k limit for a Table entity, where the byte[] property may be just up to 64k - http://msdn.microsoft.com/en-us/library/windowsazure/dd179338.aspx! – astaykov Dec 18 '11 at 16:18
  • Well actually storage transactions will be increased at least twice over (by 100%), because I have to first write what I want into the blob/table, then send the message, then read the message, then read from the blob/table, then delete the message. And if I can avoid this - I do. – astaykov Dec 18 '11 at 18:19
  • Sorry for a late response - but, I'd say that blobs are not just for storing files but for storing "long term -nonsearchable- data". Storing serialized (big) objects in blobs is a common practice on Azure. Meta data (and more) goes to tables. Queues have somewhat dual nature I do agree with that, but are certainly not meant as a storage. If you _know_ your object will not exceed a certain size limit _and_ these objects are not required in general to be accessible at any time - sure, put them in the queue. But one needs to be aware that a queue message can be visible or not, dequeued forever, etc. – Puhek Feb 17 '12 at 11:19
  • there is no way to know exactly when the queue message will be available to get the data you need. That is why I'm saying that data stored in a queue should be only data that is bound to that message operation and nothing else. But, as you have pointed out, in some cases it would probably be safe and better (transaction wise) to store data in the queue message itself. – Puhek Feb 17 '12 at 11:21
  • While it's true that queues cannot permanently *store* objects, they work perfectly well *passing* objects (as long as they're serializable in some way). And this is a very valid and common use of queues. If it's possible to store the payload in the queue message, then by all means do so, especially if that object won't be needed permanently in its passed format. This has both speed and transaction advantages. You may have a *result* that should be stored in a blob (or an Azure Table, or SQL database table, or anywhere really), or maybe a binary payload which could go to blob as-is... – David Makogon May 27 '13 at 12:28
  • I don't see why small objects like events cannot be stored in the queue. After all, they contain in most cases little information like an id. It's what the queue is meant for. – L-Four Sep 13 '13 at 11:59
  • 1
    Queues are made for passing data and instructions and what is being shown here is an excellent way of doing that. I didn't see anywhere that they were considering this for long term storage. I use a version of this using small instruction objects serialized to JSON to let processing systems know that there are new data files ready for processing. By using objects on both ends I maintain type control on a system that could fail with corrupt data. It also allows me to process the next set of files as soon as the system has some free capacity. – drobertson Mar 07 '16 at 04:09