
Forgetting is easy, coding is hard


This post first appeared on LinkedIn, published by Kane Barton, Account Manager, Kiandra IT.

I made the transition from a coding day job to something less technical years ago and could not be happier. That said, I do, from time to time, miss the creative expression that my coding past provided. Would I ever want to go back? No thanks!

However, after spending over half my adult life writing code, I am not yet ready to become one of the “I’m not technical” technology consultants. For me, it is important to retain an understanding of what it really takes to write code. It makes me a more rounded consultant. More importantly, it allows me to understand, at a deeper level, the complexities and challenges our teams face on a daily basis.

I believe we too often lose objectivity about how challenging something is to develop, and arrive with a predefined position on how long it “should” take to complete. I know I have suffered from this in the past.

So, I decided to fix my self-made concern by dusting off the keyboard and writing some code.

After taking some inspiration from the #serverless movement, I decided to write my first serverless function (an application or function that executes in the cloud without you having to worry about where it is installed, run or scaled). I wanted to keep my serverless function relevant to a typical business problem the Kiandra team solves, so I set myself the challenge of writing a function that converts a JSON document to an Excel worksheet. Note: if this is getting too techy, an example of where we solve this sort of problem is a client capturing demographic information via a sign-up page on their website, who then needs to download that information as a spreadsheet.
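To make that scenario concrete, here is a made-up example of the kind of JSON such a sign-up page might produce (the field names are purely illustrative and not from a real client):

[
  {
    "firstName": "Jane",
    "postcode": "3000",
    "ageGroup": "25-34"
  },
  {
    "firstName": "John",
    "postcode": "3121",
    "ageGroup": "35-44"
  }
]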

Code onward…

I started by creating a new class library project in Visual Studio:

Then, using NuGet, I added a reference to Newtonsoft.Json, as I would use this library to parse the JSON data:
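For anyone following along at home, the equivalent Package Manager Console command is:

Install-Package Newtonsoft.Json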

Next, I needed a library to help create the Excel file. For this example, I used the free version of the Spire.Office component (you can substitute Spire’s library for your preferred choice):
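If you prefer the Package Manager Console here too, the command below assumes the free Spire.XLS NuGet package; adjust the package name if you choose a different edition or library:

Install-Package FreeSpire.XLS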

I added a new class called “JsonToExcel.cs” and wrote the following code. Most of it should be self-explanatory, so for the purpose of this article I will skip a detailed explanation:

using Newtonsoft.Json.Linq;
using Spire.Xls;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;
 
namespace ExcelFunction
{
    public class JsonToExcel
    {
        private readonly Workbook _workBook;
        private readonly Dictionary<string, int> _columnIndexDictionary = new Dictionary<string, int>();
 
        public JsonToExcel(string worksheetName)
        {
            _workBook = new Workbook { Version = ExcelVersion.Version2013 };
            _workBook.Worksheets.Clear();
 
            ExcelWorkSheet = _workBook.Worksheets.Add(worksheetName);
 
            Row = 1;
        }
 
        private Worksheet ExcelWorkSheet { get; }
 
        private int Row { get; set; }
 
        public Stream Export(string jsonData)
        {
            var data = JArray.Parse(jsonData).Children<JObject>();
 
            var firstItem = data.FirstOrDefault();
            if (firstItem == null)
            {
                return null;
            }
 
            WireUpColumnsNames(firstItem);
 
            // add our column headers, using the first item in the JSON array
            foreach (var prop in firstItem.Properties())
            {
                Append(prop.Name, SplitPascalCase(prop.Name));
            }
            NewRow();
 
            // now populate our excel worksheet with
            // each of the items in the json array
            foreach (var item in data)
            {
                foreach (var prop in item.Properties())
                {
                    Append(prop.Name, prop.Value);
                }
                NewRow();
            }
 
            var result = new MemoryStream();
            _workBook.SaveToStream(result);
            // rewind the stream so the caller reads from the start of the workbook
            result.Seek(0, SeekOrigin.Begin);
 
            return result;
        }
 
        private void WireUpColumnsNames(JObject firstItem)
        {
            int columnIndex = 1;
            foreach (var prop in firstItem.Properties())
            {
                _columnIndexDictionary.Add(prop.Name, columnIndex);
                columnIndex++;
            }
        }
 
        private void Append(string columnName, object value)
        {
            if (value == null)
            {
                value = string.Empty;
            }
 
            var column = 1;
            if (columnName != string.Empty && _columnIndexDictionary.ContainsKey(columnName))
            {
                column = _columnIndexDictionary[columnName];
            }
 
            ExcelWorkSheet.Range[Row, column].Value = value.ToString();
        }
 
        public void NewRow()
        {
            Row++;
        }
 
        // these two functions will turn 'pascalCaseVariables' into a more friendly
        // 'Pascal Case Variables' output, as I feel this looks nicer
        public static string SplitPascalCase(string value)
        {
            value = FirstLetterToUpper(value);
            return Regex.Replace(value, "([A-Z])", " $1", RegexOptions.Compiled).Trim();
        }
 
        private static string FirstLetterToUpper(string str)
        {
            if (str == null)
            {
                return null;
            }
 
            if (str.Length > 1)
            {
                return char.ToUpper(str[0]) + str.Substring(1);
            }
 
            return str.ToUpper();
        }
    }
}
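Before deploying anything, it is worth sanity-checking the class locally. The console snippet below is not from the original post; it simply exercises JsonToExcel with a tiny JSON array and writes the result to an arbitrary local file:

using System.IO;

namespace ExcelFunction
{
    public static class Program
    {
        public static void Main()
        {
            // a tiny JSON array to exercise the converter
            var json = "[{\"colorName\":\"red\",\"hexValue\":\"#f00\"},{\"colorName\":\"green\",\"hexValue\":\"#0f0\"}]";

            var converter = new JsonToExcel("Colours");

            // Export returns a stream positioned at the start of the workbook
            using (var stream = converter.Export(json))
            using (var file = File.Create("colours.xlsx"))
            {
                stream.CopyTo(file);
            }
        }
    }
}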

As this is an example-only post, I will skip the CI and CD steps and instead create and deploy the Azure Function manually. I started by signing into my Azure BizSpark account and creating a new Function App:

I then created a custom function:

After the skeleton of the Function App was created, I needed to make the class library that converts from JSON to Excel available to the function. There are a number of different ways to achieve this; for this example, I decided to use the Kudu tools:

Using the Kudu debug command console I navigated to the Function App folder:

I then created a ‘bin’ folder and uploaded the compiled class library:
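For reference, the commands in the Kudu console were roughly the following (the function folder name is illustrative; use the name of your own function):

cd D:\home\site\wwwroot\SimpleExcelConverter
mkdir bin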

Then I switched back to the Function App and wrote the following code:

// allow the Function App to reference our class library
#r ".\bin\ExcelFunction.dll"
 
using System;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Web.Http;
 
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("SimpleExcelConverter C# HTTP trigger function processed a request.");
 
    // parse the query string parameter; this will be the name of the worksheet
    string worksheetName = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;
 
    log.Info($"Parameter.Name={worksheetName}");
 
    // read the POST body as JSON data
    string jsonData = await req.Content.ReadAsStringAsync();
 
    // for this example we can output it to the console, as this will
    // help us with debugging
    log.Info($"Body={jsonData}");
 
    // create an instance of our class library
    var reportBuilder = new ExcelFunction.JsonToExcel(worksheetName);
 
    // call our method to convert Json to Excel
    var documentStream = reportBuilder.Export(jsonData);
 
    // return the file as a stream content
    var result = new HttpResponseMessage(HttpStatusCode.OK) { Content = new StreamContent(documentStream) };
    result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
    result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
    {
        FileName = $"SimpleExcelConverter_{worksheetName}_{DateTime.Now.Ticks}.xlsx",
        Size = documentStream.Length
    };
 
    return result;
}
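For completeness, the HTTP-triggered function the portal scaffolds is wired up by a function.json similar to the one below. This is the portal default rather than anything I hand-edited, so treat it as indicative only:

{
  "disabled": false,
  "bindings": [
    {
      "authLevel": "function",
      "name": "req",
      "type": "httpTrigger",
      "direction": "in"
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    }
  ]
}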

I then used the Function App test console with this JSON data example:

[
  {
    "colorName": "red",
    "hexValue": "#f00"
  },
  {
    "colorName": "green",
    "hexValue": "#0f0"
  },
  {
    "colorName": "blue",
    "hexValue": "#00f"
  },
  {
    "colorName": "cyan",
    "hexValue": "#0ff"
  },
  {
    "colorName": "magenta",
    "hexValue": "#f0f"
  },
  {
    "colorName": "yellow",
    "hexValue": "#ff0"
  },
  {
    "colorName": "black",
    "hexValue": "#000"
  }
]

Here is the console output:

The Azure test console is great, but I really wanted to see the output in Excel. So I loaded up Postman and configured my call:

(Tip: use the </> Get function URL link in the Azure test console.)
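If you would rather call the function from code than from Postman, a minimal C# client looks something like this; the URL and key below are placeholders for the values shown by Get function URL:

using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

public static class FunctionClient
{
    public static async Task DownloadAsync()
    {
        // placeholder URL; copy the real one from "Get function URL" in the portal
        var url = "https://<your-app>.azurewebsites.net/api/SimpleExcelConverter?code=<function-key>&name=Colours";

        var json = "[{\"colorName\":\"red\",\"hexValue\":\"#f00\"}]";

        using (var client = new HttpClient())
        {
            // POST the JSON body and save the returned workbook to disk
            var response = await client.PostAsync(url, new StringContent(json));
            response.EnsureSuccessStatusCode();

            using (var excel = await response.Content.ReadAsStreamAsync())
            using (var file = File.Create("colours.xlsx"))
            {
                await excel.CopyToAsync(file);
            }
        }
    }
}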

Here is the output in Excel:

As you would imagine, the code is an example only and not production ready. I developed my example using Visual Studio 2017 (15.2), so for those of you who have the Visual Studio 2017 Tools for Azure, life will be a little easier.

So what did I learn?

  • Writing code in a relaxed environment (i.e. without pressure) can be fun
  • Writing production-ready, security-hardened and fault-tolerant code is time consuming and much harder than I remembered
  • New advances in technology, like serverless, open up a wealth of new opportunities
  • If your software development partner is not making you aware of these new advances in technology, you should be asking questions about them.

I hope you enjoyed this post, and I would like to give special thanks to: