Setting up Docker on Linode

A quick guide to setting up Docker on a Linode Ubuntu server. As I go deeper into exploring Docker and building applications and services in new ways, I wanted a post that acts as a starting point.

Installing Docker

Start by updating the package index:

$ sudo apt-get update

Next, add the GPG key for the official Docker repository to the system:

$ curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -

Add the Docker repository to APT sources:

$ sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"

Then update the package index again so APT can see the new repository, and install Docker:

$ sudo apt-get update
$ sudo apt-get install -y docker-ce

Check the Docker status:

$ sudo systemctl status docker
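
If you want a quick end-to-end sanity check, you can also run the hello-world image (this assumes your server can reach Docker Hub):

$ sudo docker run hello-world

If Docker prints its hello message, the engine can pull and run containers.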

That's it. Docker is now running on our Linux server. I will be adding more posts to the blog as I explore Docker more.


NGINX hosting a site on Linode

I wanted to host my own personal site https://jameskenny.io on a Linode Linux server and run NGINX to host the static site.

I've used Linode before and really recommend them. I'll also be doing some work with hosting Docker containers. Over the next few blog posts I'll go into the details of how to do it.

Linode

Linode is a hosting company that offers Linux hosts with SSD hard drives, along with some great management tools. It's quick and easy to start a server.

I created a new Linode, a Linode 1GB, in Frankfurt, Germany. This server costs $5 a month. Not bad for my own site and a few other test ideas. I deployed it with Ubuntu Linux.

Once it's deployed, we go ahead and SSH onto the server.

Install NGINX

So on our Ubuntu server, we first run:

sudo apt-get update

sudo apt-get install nginx

NGINX is now installed.

Setting up our site

My site is just a simple HTML page with some CSS behind it.

By default, NGINX expects your static files to be in a specific directory. You can override this in the configuration, but for what I'm doing the defaults are fine.

/var/www/

So I go ahead and create a new folder jameskenny.io and upload my files.

/var/www/jameskenny.io
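
One way to get the files onto the server is scp. On the server, create the folder first:

sudo mkdir -p /var/www/jameskenny.io

Then from your local machine, copy the site up (user, your-server-ip, and the local site folder are placeholders for your own details):

scp -r site/* user@your-server-ip:/var/www/jameskenny.io/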

Configure NGINX to serve the website

Next we need to set up the NGINX config to serve the site. The configuration lives in:

/etc/nginx/

We're interested in two folders here.

  • sites-available: contains individual configuration files for all of your possible static websites.
  • sites-enabled: contains links to the configuration files that NGINX will actually read and run.

So in our sites-available folder we create a new config file called

jameskenny.io

server {
  listen 80;
  listen [::]:80;

  root /var/www/jameskenny.io;
  index index.html;

  server_name jameskenny.io www.jameskenny.io;

  location / {
    try_files $uri $uri/ =404;
  }
}

Make sure to replace jameskenny.io with your own domain and folder path.

This file tells NGINX a few things:

  • Listen to any traffic on port 80
  • Deliver files from the folder /var/www/jameskenny.io
  • The main index page is called index.html.
  • Requests for jameskenny.io should be served by this server block.
  • Note the www is also listed separately. This tells NGINX to also route requests starting with www to the site. There's actually nothing special about the www; it's treated like any other subdomain, and you can see how you could use this for other subdomains very quickly.

Next we need to create a symbolic link in our sites-enabled folder. In our SSH session we run:

sudo ln -s /etc/nginx/sites-available/jameskenny.io /etc/nginx/sites-enabled/jameskenny.io

This creates a link pointing at our config, telling NGINX to serve our site.
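
Before restarting, it's worth asking NGINX to validate the configuration. nginx -t checks the syntax of every config file it loads:

sudo nginx -t

If it reports the test is successful, we're safe to restart.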

Now we restart NGINX to load the new configuration.

sudo systemctl restart nginx

Just point your domain's A record at your new server and go check out your site.


Azure Functions with Continuous Deployment

A quick guide to setting up continuous deployment for Azure functions.

Continuous Deployment (or Continuous Delivery) lets us deploy our application or code into production or test environments on each check-in to source control. This lets us focus on our craft and not on deployments.

Setting up CD for Azure functions is really a simple process.

Source code can be found here

Create a function

First we need some functions. I've created a simple GET function. I'm assuming you have set up functions and seen them in action already; if not, check out my blog posts on Azure Functions.

Create a folder for your Azure function.

Add a host.json file and leave it empty for now:

{}

The host.json needs to contain at least {}, which counts as an empty configuration. Leaving the file completely blank will stop the function from starting.

Create a folder for your first function. Each function should have a folder.

In this folder create 3 files.

  • project.json

{
    "frameworks": {
        "net46": {
            "dependencies": {
                "Newtonsoft.Json": "10.0.3"
            }
        }
    }
}
  • function.json
{
    "disabled": false,
    "bindings": [
      {
        "authLevel": "anonymous",
        "name": "req",
        "type": "httpTrigger",
        "direction": "in"
      },
      {
        "name": "$return",
        "type": "http",
        "direction": "out"
      }
    ]
}

Note: I've created this function as a GET with anonymous authentication so we can call it with no keys.

  • run.csx
using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");
    
    return req.CreateResponse(HttpStatusCode.OK, "Hello there");
}
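
At this point the layout looks something like this (FunctionApp and HttpDemoGet are just example names for the repository and function folder; each function gets its own folder):

FunctionApp/
├── host.json
└── HttpDemoGet/
    ├── project.json
    ├── function.json
    └── run.csx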

Go ahead and commit the function to your source control.

Create a function app

Back onto the Azure portal. Create a new function app.

Once created, head to the "Platform Features" tab.

[Screenshot: the Platform Features tab]

Select "Deployment options", then "Setup".

[Screenshot: the deployment setup screen]

Configure sources

Azure Functions allows you to deploy from different sources.

[Screenshot: the available deployment sources]

Follow the steps to select the source you want. I am going with GitHub.

Once set up, follow the steps to add your function's source. Then click sync, or make a change and push it to your source control.

[Screenshot: a deployment sync in progress]

That's it. Go ahead and test the function.

[Screenshot: testing the function in the portal]

When I run the test I get the hello response.
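
You can also hit the function over plain HTTP. Since the binding uses anonymous authentication, no key is needed; the app and function names below are placeholders for your own:

curl https://your-function-app.azurewebsites.net/api/HttpDemoGet

This should return the "Hello there" message from run.csx.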

Azure makes it really easy to deploy from source control. There is also the ability to add tests; I'll cover that in another post.


Azure Functions Routes and Proxies

Azure Function Proxies allow you to create a single unified API surface for your Azure Functions. The Microsoft Azure stack allows you to use different technologies, so you can use the right tool for the job.

Azure Functions can also have routes. Unlike a proxy, a route will only affect the Azure Function it is defined on.

A proxy allows us to create a single clean set of endpoints and Azure handles all the routing to the right place.

Getting Started

I've created two simple Azure Functions; both are HTTP trigger functions.

Source code here

GET - HTTPDEMOGET

We have a GET Function that will return an object from a pretend database.

POST - HTTPDEMOPOST

We have a POST function that will save an object somewhere.

Source for our functions is here

Azure Function Routes

First we will look at routing in Azure Functions.

POST Function

First let's set the routing on our POST function.

When we create a function we get a default route that looks something like this:

https://serversncodefunctiondemo.azurewebsites.net/api/HttpDemoPost?code=SJ47E3DDWAMeWw2sRU9aKhFYJPFacCTdtA/K7qu5GH86U2JNdKD6jA==

Changing the route on our function is simple: in our function.json we add a route property, for example:

{
  "bindings": [
    {
      "authLevel": "function",
      "name": "req",
      "type": "httpTrigger",
      "direction": "in",
      "route": "serversncodedemo"
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    }
  ],
  "disabled": false
}

I've modified my function to now have "route": "serversncodedemo"

This changes my URL to replace the httpdemopost part with serversncodedemo.

https://serversncodefunctiondemo.azurewebsites.net/api/serversncodedemo?code=SJ47E3DDWAMeWw2sRU9aKhFYJPFacCTdtA/K7qu5GH86U2JNdKD6jA==

Now when we use that URL and pass our JSON payload, we hit the same function. We have changed our function's route.
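
As a quick usage sketch, posting a payload with curl might look like this (the JSON body and YOUR-FUNCTION-KEY are placeholders for your own):

curl -X POST "https://serversncodefunctiondemo.azurewebsites.net/api/serversncodedemo?code=YOUR-FUNCTION-KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "test"}'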

GET Function

For this demo our GET function will take an ID and return the record from our magic database.

Setting a function up as a GET with a route parameter is done in the function.json.

In the bindings, I've set my function to

"route": "serversncodedemo/{id}"

So my function.json looks like this

    "bindings": [
      {
        "authLevel": "anonymous",
        "name": "req",
        "type": "httpTrigger",
        "direction": "in",
        "route": "serversncodedemo/{id}"
      },

Changing the authLevel is not required, but I've done it for this demo.

Setting the route is the key part: we're telling the function to expect id as part of the route to the function.

In our run.csx we have to add string id as a parameter to our function:

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, string id, TraceWriter log)

We can now see it in a route like this

https://serversncodefunctiondemo.azurewebsites.net/api/serversncodedemo/{id}

We replace {id} with whatever we want and we will get a response.
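
For example, calling it with curl (using 42 as an arbitrary id):

curl https://serversncodefunctiondemo.azurewebsites.net/api/serversncodedemo/42

Since we set the authLevel to anonymous, no function key is needed here.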

We have now set the routes in our functions. Both share the same base, serversncodefunctiondemo.azurewebsites.net/api/serversncodedemo/, with small differences between our POST and GET functions.

Azure Function Proxies

Proxies give us more control over all our functions, or just selected methods, instead of having to set the route in the function.json of each function. Also, if you're using a custom domain, it works with proxies. Proxies are in preview as of the writing of this post.

So let's create a proxy.

[Screenshot: creating a new proxy]

On our function app, we create a new proxy.

Define-Azure-Function-Proxy

I've created a proxy called get.

We set the route template to person/{id} and select GET as the allowed method.

The backend URL is optional but in this case I set it to the same as the GET Function we created earlier.

https://serversncodefunctiondemo.azurewebsites.net/api/serversncodedemo/{id}

Click create and our Proxy is now created and ready.
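
Behind the scenes, proxies are described by a proxies.json file in the root of the function app. A rough sketch of what gets generated for this proxy (names match the demo above):

{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "get": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "person/{id}"
      },
      "backendUri": "https://serversncodefunctiondemo.azurewebsites.net/api/serversncodedemo/{id}"
    }
  }
}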

[Screenshot: the proxy detail view]

We have now created a proxy for our GET function, so instead of having to use

https://serversncodefunctiondemo.azurewebsites.net/api/serversncodedemo/{id}

we can use

https://serversncodefunctiondemo.azurewebsites.net/person/{id}

Proxies and routing provide a powerful and flexible layer on top of Azure Functions to help organize your API.

A quick note on custom domains: you can add a custom domain to Azure Functions in the same way as for Azure App Service, but as of the writing of this post there are restrictions. I couldn't set it to a root domain and had to use a subdomain.

Source code here


Azure Functions with Table Storage

Using an HTTP Trigger Azure Function and storing the data in an Azure Table Storage account. Azure Table Storage is a service that stores structured NoSQL data in the cloud, providing a key/attribute store with a schemaless design.

You can find the code for this here

This post builds on some other posts I've done,

I've covered getting started on Azure Table Storage in another post: Azure Table Storage

I've also covered creating a HTTP Trigger Azure Function

So let's get to it.

Adding Table Storage

In the project.json file we need to add the NuGet reference for "WindowsAzure.Storage"; as of writing this post the current version is 8.4.0. Our project.json will look something like this:

{
    "frameworks": {
        "net46": {
            "dependencies": {
                "Newtonsoft.Json": "10.0.3",
                "WindowsAzure.Storage": "8.4.0"
            }
        }
    }
}

I've also got "Newtonsoft.Json" there because I want to deserialize the payload.

HTTP Trigger Function

Now over to our function in the run.csx. First we need our classes; I have two. Our function will look something like this:

using System.Net;
using Newtonsoft.Json;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");    

    dynamic body = await req.Content.ReadAsStringAsync();
    var e = JsonConvert.DeserializeObject<Person>(body as string);

    // Define the row,
    string sRow = e.email + e.lastname;

    // Create the Entity and set the partition to signup, 
    PersonEntity _person = new PersonEntity("signup", sRow);

    _person.First_Name_VC = e.firstname;
    _person.Last_Name_VC = e.lastname;
    _person.Email_VC = e.email;

    // Connect to the Storage account.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("XXX");

    CloudTableClient tableClient = storageAccount.CreateCloudTableClient();

    CloudTable table = tableClient.GetTableReference("personitems");

    table.CreateIfNotExists();

    TableOperation insertOperation = TableOperation.Insert(_person);

    table.Execute(insertOperation);

    return req.CreateResponse(HttpStatusCode.OK, "Ok");
}

public class Person{
    public string firstname {get;set;}
    public string lastname {get;set;}
    public string email {get;set;}
}

public class PersonEntity : TableEntity
{
    public PersonEntity(string skey, string srow)
    {
        this.PartitionKey = skey;
        this.RowKey = srow;
    }

    public PersonEntity() { }

    public string First_Name_VC { get; set; }
    public string Last_Name_VC { get; set; }
    public string Email_VC { get; set;}
}

So let's break it down and take a look.

Person Class and PersonEntity

At the bottom of our Function we add our classes we want to use.

public class Person{
    public string firstname {get;set;}
    public string lastname {get;set;}
    public string email {get;set;}
}

This person class is for the payload.
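
For reference, the JSON payload we're expecting looks like this (values are just examples):

{
    "firstname": "Jane",
    "lastname": "Doe",
    "email": "jane.doe@example.com"
}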

public class PersonEntity : TableEntity
{
    public PersonEntity(string skey, string srow)
    {
        this.PartitionKey = skey;
        this.RowKey = srow;
    }

    public PersonEntity() { }
    public string First_Name_VC { get; set; }
    public string Last_Name_VC { get; set; }
    public string Email_VC { get; set;}
}

This is our Table storage entity.

References

Next, at the top, we need to add our references:

using System.Net;
using Newtonsoft.Json;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

From HTTP Trigger to Table Storage

Now that the housekeeping is done, let's walk through our function and save our information into Azure Table Storage.

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");    

    dynamic body = await req.Content.ReadAsStringAsync();
    var e = JsonConvert.DeserializeObject<Person>(body as string);

    // Define the row,
    string sRow = e.email + e.lastname;

    // Create the Entity and set the partition to signup, 
    PersonEntity _person = new PersonEntity("signup", sRow);

    _person.First_Name_VC = e.firstname;
    _person.Last_Name_VC = e.lastname;
    _person.Email_VC = e.email;

    // Connect to the Storage account.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("XXX");

    CloudTableClient tableClient = storageAccount.CreateCloudTableClient();

    CloudTable table = tableClient.GetTableReference("personitems");

    table.CreateIfNotExists();

    TableOperation insertOperation = TableOperation.Insert(_person);

    table.Execute(insertOperation);

    return req.CreateResponse(HttpStatusCode.OK, "Ok");
}

First up we want to deserialize our payload and turn it into an object.


    dynamic body = await req.Content.ReadAsStringAsync();
    var e = JsonConvert.DeserializeObject<Person>(body as string);

dynamic body reads the content of the HttpRequestMessage req.
We then pass that into JsonConvert as a string. JsonConvert.DeserializeObject turns it into an object we can use.

Next we create our table entity.

    // Define the row,
    string sRow = e.email + e.lastname;

    // Create the Entity and set the partition to signup, 
    PersonEntity _person = new PersonEntity("signup", sRow);

    _person.First_Name_VC = e.firstname;
    _person.Last_Name_VC = e.lastname;
    _person.Email_VC = e.email;

Table Storage uses a partition key and row key for each object, and it's up to you to set them. In this case I have set the partition to signup. The RowKey should be unique within the partition; I've set it by combining the email and last name. In the real world I would only allow each email into my Table Storage once, so I would validate that it didn't already exist. Email alone could then be the row key; set it according to your use case.

Next we connect to our storage account and save the entity.

    // Connect to the Storage account.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("XXX");

    CloudTableClient tableClient = storageAccount.CreateCloudTableClient();

    CloudTable table = tableClient.GetTableReference("personitems");

    table.CreateIfNotExists();

    TableOperation insertOperation = TableOperation.Insert(_person);

    table.Execute(insertOperation);

We connect to our storage account and get a reference to our "personitems" table; I've added a check to create the table if it doesn't exist.

We then create a TableOperation and insert our new record into the table.
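
One note on the connection string: rather than hard-coding it where I have "XXX", you could read it from an app setting. A small sketch, assuming the function app's default AzureWebJobsStorage setting holds your storage connection string:

// Read the connection string from the function app's settings
// instead of embedding it in the code.
var connectionString = System.Environment.GetEnvironmentVariable("AzureWebJobsStorage");
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);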

Wrap up

To end I have the function return OK, in the real world you might want more information here.

    return req.CreateResponse(HttpStatusCode.OK, "Ok");

We now have a function that takes an HTTP payload and stores it in Azure Table Storage.

Expanding Functions

One of the great things about functions is you can create bindings on other things. So you could have another function that runs when an entry is made in Azure Table Storage. That trigger would do another job with the new person that has just signed up.

That's it for this one. For more on Azure Functions and serverless, check out the series of posts I've created here
