Intro

In a previous post I described how to work with blob snapshots. In this post I am going to show how to work with block blobs.

Prerequisites

Refer to previous posts to get started with the Azure Blob Service.
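The examples below all use a container variable. A minimal sketch of how it can be obtained, assuming the Microsoft.WindowsAzure.Storage client library and a connection string of your own (the development storage emulator is used here as a placeholder):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Replace the connection string with your own account credentials;
// "UseDevelopmentStorage=true" targets the local storage emulator.
var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
var client = account.CreateCloudBlobClient();
var container = client.GetContainerReference("test");
container.CreateIfNotExists();
```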

Code

Block blobs are optimized for uploading and downloading large amounts of data efficiently. For most scenarios requiring a blob, a block blob is what you will want to use.

Consider the following example:

var test = container.GetBlockBlobReference("test.txt");
test.UploadText("test content");

This code will create a new blob with the given content. It is also possible to create a blob from a byte array, a file, or a stream using the corresponding UploadFromByteArray, UploadFromFile, or UploadFromStream methods. If you use the Azure Storage Library, you don't have to worry about the size of the blob you are uploading; the library will automatically upload it in chunks.
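The byte array and stream overloads can be sketched as follows (same test.txt blob as above; the content is an assumption for illustration):

```csharp
using System.IO;
using System.Text;

var test = container.GetBlockBlobReference("test.txt");

// From a byte array: index 0, full length.
var bytes = Encoding.UTF8.GetBytes("test content");
test.UploadFromByteArray(bytes, 0, bytes.Length);

// From a stream; the library reads the stream to the end.
using (var stream = new MemoryStream(bytes))
{
    test.UploadFromStream(stream);
}
```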

The following code will upload 100MB of data:

File.WriteAllBytes("C:\\temp\\test.dat", new byte[1024 * 1024 * 100]);
var test = container.GetBlockBlobReference("test.dat");
test.UploadFromFile("C:\\temp\\test.dat");

It is also possible to download data:

var test = container.GetBlockBlobReference("test.dat");
test.DownloadToFile("C:\\temp\\test2.dat", FileMode.Create);

This code downloads the blob to the test2.dat file. It is also possible to download a blob to a byte array, a stream, or a string using the corresponding methods.
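A sketch of the other download methods, using the test.txt blob from the upload example:

```csharp
using System.IO;

var test = container.GetBlockBlobReference("test.txt");

// To a string.
var text = test.DownloadText();

// To a byte array; the target array must be large enough,
// so fetch the blob's length first.
test.FetchAttributes();
var bytes = new byte[test.Properties.Length];
test.DownloadToByteArray(bytes, 0);

// To a stream.
using (var stream = new MemoryStream())
{
    test.DownloadToStream(stream);
}
```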

It is also possible to work with a blob as a stream:

var test = container.GetBlockBlobReference("test1.dat");
var r = new Random();
var buffer = new byte[1024];
using (var stream = test.OpenWrite())
{
    for (var i = 0; i < 1024 * 1024; i++)
    {
        r.NextBytes(buffer);
        stream.Write(buffer, 0, buffer.Length);
    }
}

This code will write 1GB of random data to the test1.dat blob. Although the application writes 1KB at a time, the client library buffers the writes and uploads them to the server in 4MB blocks by default.

The following code reads the data back through a stream; each Read call fills a 1KB buffer, while the library fetches the blob from the server in larger chunks:

var test = container.GetBlockBlobReference("test1.dat");
var buffer = new byte[1024];
using (var stream = test.OpenRead())
{
    while (stream.Read(buffer, 0, 1024) != 0)
    {
    }
}

It is possible to work with a block blob as a set of individual blocks.

Let's download the list of blocks using the following code:

var test = container.GetBlockBlobReference("test.dat");
var blocks = test.DownloadBlockList();
Console.WriteLine(blocks.Count());
foreach (var block in blocks)
{
    Console.WriteLine(block.Name);
}

This code prints the number of blocks and the block names. Now we can download and modify an individual block:

var test = container.GetBlockBlobReference("test.dat");
var blocks = test.DownloadBlockList().ToList();
var block = blocks.Skip(10).First();
var buff = new byte[block.Length];
var blobOffset = blocks.Take(10).Sum(_ => _.Length);
test.DownloadRangeToByteArray(buff, 0, blobOffset, block.Length);

for (var i = 0; i < buff.Length; i++)
{
    buff[i]++;
}

test.PutBlock(block.Name, new MemoryStream(buff), null);
test.PutBlockList(blocks.Select(_ => _.Name).ToList());

This code calculates the start position of the 11th block and downloads it to a byte array. Then it modifies the block's contents and puts the block back to the blob on the server. It is very important to call PutBlockList to commit our changes to the blob; until then, the new block remains uncommitted.
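A blob can also be composed from explicitly uploaded blocks. A minimal sketch, assuming a hypothetical blocks.dat blob name; block IDs must be Base64-encoded strings of the same length for every block in a blob:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

var test = container.GetBlockBlobReference("blocks.dat");
var blockIds = new List<string>();
var data = new byte[4 * 1024 * 1024]; // one 4MB block of zeros

for (var i = 0; i < 3; i++)
{
    // Fixed-width counter keeps every Base64 block ID the same length.
    var blockId = Convert.ToBase64String(
        Encoding.UTF8.GetBytes(i.ToString("d6")));
    test.PutBlock(blockId, new MemoryStream(data), null);
    blockIds.Add(blockId);
}

// Uncommitted blocks are eventually discarded by the service
// unless PutBlockList commits them to the blob.
test.PutBlockList(blockIds);
```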

Summary

In this post I started covering Azure Blob Service blobs and showed how to work with block blobs. To get more information about block blobs, visit About Block Blobs. In the next post I am going to show how to work with page blobs.
