Institute of Computer Science

Cloud Computing 2021/22 spring


Practice 14 - Application deployment in Azure Cloud Services

In this practice session, you will learn about application deployment on Azure public cloud services. The application can use multiple services such as object and database storage, web application hosting, and further scalability services. Azure provides a variety of services that enable faster design and deployment of applications. The aim of this lab is to design and deploy a message board application with extended features (image upload, storage, and access) using Azure services.

References

  1. Azure Cloud documentation: https://docs.microsoft.com/en-us/azure/?product=featured
  2. Azure Functions: https://azure.microsoft.com/en-us/services/functions/
  3. Azure Blob Storage: https://azure.microsoft.com/en-us/services/storage/blobs/
  4. Azure Cosmos DB: https://cosmos.azure.com/

Exercise 14.1 Introduction to application and setting up of development machine

In this practice session, we are going to re-use the message-board application code (messages stored locally in data.json) developed in Practice Session 2. We will extend the message board application by using different Azure services as mentioned below.

  1. Update home.html with an option to upload images along with text messages, store the images in the Azure Blob Storage service, and finally access and display the same images on the web page along with the messages.
  2. Images uploaded by end-users will come in different sizes, so design a resize (thumbnail) Azure Function that resizes each image uploaded into blob storage. The resize function should be triggered when a new image arrives in Azure Blob Storage, and its output image should be stored back in blob storage.
  3. Use Azure Cosmos DB (a NoSQL database) to store the message data as JSON documents.
  4. Finally, create and deploy the message board application to Azure App Service.

We will use a VM as a development machine to develop the application and work with Azure tools and services. Launch an OpenStack VM, install the Azure CLI, and get ready with the existing message board application code.

  • Use your last name as a part of the instance name!
  • Source: Use Volume Snapshot, choose Ubuntu 20 + Docker 20.10
    • in this Ubuntu-based snapshot, the installation of Docker (as we did in Lab 2) has already been done for us.
    • Enable "Delete Volume on Instance Delete"
  • Flavour: should be m2.tiny
  • Install the Python virtual environment package, then create and activate a Python virtual environment.
  • Install the Azure CLI as mentioned in Lab 13 (Exercise 13.5) and log in to your Azure account using the command az login.
  • Clone the message board code: git clone https://bitbucket.org/jaks6/cloud-computing-2022-lab-1.git webapp, or reuse the code you submitted in Practice 2.

Now, modify the application as per below instructions:

  • Update home.html so that users can upload an image along with their message. The image should end up being stored on disk. (Make sure your name is displayed on the web page).
    • For example, you can follow the approach explained in this guide ( However, do not include the parts about using .env, "flash" and "flash messages" , they can cause conflicts when deploying to Azure later).
  • Make sure you update home.html to display the saved image along with the text message.
    • Tip: Update the append_message_to_file function, so that when messages are stored to disk (data.json file), then the filepath of the image is also saved in addition to the message contents (call the new property 'img_path').

After these updates, the application should let a user post a message with an attached image and see both displayed on the page.

  • Test the application.
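The tip above about extending append_message_to_file can be sketched as follows. This is a minimal sketch: the exact data.json layout follows Practice 2, and the {"messages": [...]} wrapper and timestamp field are assumptions based on the document structure used later in this lab.

```python
import json
import os
from datetime import datetime

DATA_FILE = "data.json"  # local message store, as in Practice 2

def append_message_to_file(content, img_path):
    """Append a message, together with the path of its uploaded image, to data.json."""
    # Load the existing messages, or start fresh if the file does not exist yet.
    if os.path.exists(DATA_FILE):
        with open(DATA_FILE) as f:
            data = json.load(f)
    else:
        data = {"messages": []}

    data["messages"].append({
        "content": content,
        "img_path": img_path,  # the new property storing where the image was saved
        "timestamp": datetime.now().isoformat(" ", "seconds"),
    })

    with open(DATA_FILE, "w") as f:
        json.dump(data, f, indent=2)
```

home() would call this with the message text and the path under which the uploaded image was saved on disk.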

Exercise 14.2 Working with the Azure Blob Storage

In this task, we are going to work with Azure Blob Storage. Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that doesn't adhere to a particular data model or definition, such as text or binary data. Here, we are going to store the images uploaded by the end-user in Azure Blob Storage.

  • Add python SDK for azure blob storage azure-storage-blob==12.3.1 in requirements.txt.
  • Let us create an Azure resource group, storage account, and containers to store the images.
    • Use the guidelines from this manual and follow accordingly
      • Create a resource group with the name lab14 and location should be northeurope
      • Create a storage account; use your last name as part of the storage account name (<storage-account>).
      • Create a storage container with the name raw.
         az storage container create \
            --account-name <storage-account> \
            --name <container> 
      • Make the blobs available with public access. Please refer to the az command here
      • Note down the account key from your storage account using az storage account keys list --account-name <STORAGE_ACCOUNT>. Use the key1 value; we will use it as the connection string in the next steps.
      • Set STORAGE_ACCOUNT=<STORAGE_ACCOUNT> and CONN_STRING=<CONN_STRING> as environment variables.
  • Now, let us update app.py to use azure blobs
    • Import the Azure blob storage library from azure.storage.blob import BlobServiceClient
    • Read environment variables connect_str= os.getenv('CONN_STRING'), storage account name storage_account = os.getenv('STORAGE_ACCOUNT') and declare variable raw_images_container='raw'
    • Create connection client blob_service_client = BlobServiceClient(account_url="https://"+storage_account+".blob.core.windows.net/",credential=connect_str)
    • Now, write a function insert_blob in app.py to upload the image into Azure blob.
      • The function receives the saved image path as input: insert_blob(img_path)
      • Get the filename of the image filename = (img_path).split('/')[-1]
      • You can refer to this document for an example to upload an item to blob using the Azure blob python library.
        • Create a blob client using the local file name as the name for the blob blob_client = blob_service_client.get_blob_client(container=raw_images_container, blob=filename)
        • Upload the image file to blob. (Refer # Upload the created file in the document link)
    • Update home(): method
      • Now call the function insert_blob(img_path) with saved image path as argument, received from end user from the web page.
      • Further, let's change how append_message_to_file gets called in home(). We will update the img_path value in data.json to store the image's Blob Storage URL instead of the local disk path. For this, change the 2nd argument of append_message_to_file to: append_message_to_file(new_message, blob_url)
        • For blob_url, use the following value: 'https://'+storage_account+'.blob.core.windows.net/'+raw_images_container+'/'+(img_path).split('/')[-1]
  • Install the requirements.txt packages
  • Test the application.
  • Deliverable: Take screenshot of the web page of application containing message and image (Your IP should be visible).
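Putting the app.py pieces from this exercise together, the upload code could be sketched like this. The blob_url_for helper is a name introduced here for illustration (the steps above inline the expression instead), and the azure-storage-blob import is deferred into the function so the rest of the module can be read without the SDK installed.

```python
import os

# Configuration read from the environment variables set earlier in this exercise.
storage_account = os.getenv('STORAGE_ACCOUNT')
connect_str = os.getenv('CONN_STRING')
raw_images_container = 'raw'

def blob_url_for(img_path, account, container):
    """Build the public Blob Storage URL for a locally saved image (hypothetical helper)."""
    filename = img_path.split('/')[-1]
    return "https://" + account + ".blob.core.windows.net/" + container + "/" + filename

def insert_blob(img_path):
    """Upload a locally saved image into the 'raw' container."""
    # Imported lazily so this sketch can be read without the SDK installed.
    from azure.storage.blob import BlobServiceClient

    blob_service_client = BlobServiceClient(
        account_url="https://" + storage_account + ".blob.core.windows.net/",
        credential=connect_str)

    # Use the local file name as the name for the blob.
    filename = img_path.split('/')[-1]
    blob_client = blob_service_client.get_blob_client(
        container=raw_images_container, blob=filename)

    with open(img_path, "rb") as data:
        blob_client.upload_blob(data, overwrite=True)
```

In home(), you would then call insert_blob(img_path) and pass the blob URL as the 2nd argument of append_message_to_file.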

Exercise 14.3 Working with Blob Triggers and Azure functions

In this task, we will create a function to resize the images stored in the Azure blob. Azure Functions can be triggered when new items are inserted into Azure Blob Storage; in general, Azure Functions supports invocation based on multiple types of event triggers. Here, we write an Azure Function that is triggered by a storage container insert event, generates a thumbnail (resized version) of the newly inserted image, and stores the output in another container.

  • Create a container with the name processed (also, make it public like in Task 14.2 ) to store the resized images.
  • Install Azure Function Core Tools command-line tools as mentioned in the document
  • To make sure everything is set up correctly, let's look at which Azure Functions templates are available. They can be used as a basis for creating functions: func templates list -l python
  • Find the Azure Function App on the Azure portal to create a Function App (PS! You can also create it using a command like az functionapp create --consumption-plan-location northeurope --name <FUNCTION_APP_NAME> --os-type Linux --resource-group <RESOURCE_GROUP> --runtime python --runtime-version 3.8 --storage-account <STORAGE_ACCOUNT>)
    • Create a new Azure Function App. It works as a container for your Azure Functions.
      • Set the Resource Group to the previous Resource Group that you used in the second task.
      • FUNCTION_APP_NAME: Choose a name freely; we will need it again later
      • RESOURCE_GROUP : Resource group you created in previous task
      • STORAGE_ACCOUNT: Storage account name you created in the previous task
  • Make directory in the development machine mkdir ~/function && cd ~/function
  • Create a new local Azure Functions project that uses the Python environment:
    • func init localhsproj --python
  • Let's move to the created folder cd localhsproj
    • Create a function func new --name FUNCTION_NAME --template "Azure Blob Storage trigger"
      • NB! Replace FUNCTION_NAME with the name of your choice.
      • We choose the base type of the function: the Azure Blob Storage trigger template - this means that the function is designed to start when Azure Blob service events occur (such as when a new blob is uploaded).
      • As a result, a FUNCTION_NAME named folder is created
    • Move to the FUNCTION_NAME directory; we will modify the contents of the bindings section of the function.json file:
      • Set connection to AzureWebJobsStorage
        • Through the AzureWebJobsStorage environment variable, our function gains access to the Azure blob storage connection information that can be used to create a new connection (such as creating or modifying a file)
      • Update the path value from samples-workitems/{name} to the location of our Azure Blob Storage container: raw/{name}
        • This binds the function to the raw blob container; the function is started for each new file uploaded there.
      • Update local.settings.json, with value of AzureWebJobsStorage to "DefaultEndpointsProtocol=https;AccountName=<STORAGE_ACCOUNT>;AccountKey=<CONN_STRING>;EndpointSuffix=core.windows.net"
  • Run the function app locally:
    • Move to the Function App root folder ( localhsproj )
    • Install all the Python libraries specified inside the requirements.txt file: pip install -r requirements.txt
    • Start the function service locally: func host start
  • Now, let us modify the function to resize the new image and store it in the processed blob container
    • Modify the requirements.txt file to add a Python library for handling images:
      • add a row Pillow for the library
    • Move to the function directory and edit the function.json file
      • This file defines where the function takes inputs and what its outputs are.
      • Currently, only input is defined in the file (myblob, which is of type blobTrigger).
        • This means that any objects appearing under the container path "raw/{name}" get passed to the function under the parameter name "myblob"
    • Add one more element to the bindings list of the json:
      •     {
              "name": "blobout",
              "type": "blob",
              "direction": "out",
              "path": "processed/{name}",
              "connection": "AzureWebJobsStorage"
            }
        
      • This tells Azure that any result written to "blobout" should be stored to the Blob Storage container path "processed/{name}".
    • Implement the contents of the __init__.py file
      • Import the required libraries:
        • from io import BytesIO
        • from PIL import Image
    • Change the main() method signature to include an output object reference (blobout) in the arguments.
      • It should now look like this:
        • main(myblob: func.InputStream, blobout: func.Out[func.InputStream]):
    • Change the contents of the main() method:
      • Get the filename filename = (myblob.name).split('/')[1]
      • Read the input image input_image = myblob
      • Declare name for output image output_image_name = "/tmp/"+filename
      • Read the image base_image = Image.open(input_image)
      • Define size of thumbnail new_size = 150,150
      • Create thumbnail base_image.thumbnail(new_size)
      • Save thumbnail image base_image.save(output_image_name, quality=95)
      • Set the thumbnail to output blob new_thumbfile = open(output_image_name,"rb") and blobout.set(new_thumbfile.read())
  • Now, let us test the function locally
    • Make sure you are under the directory function/localhsproj
    • Install the pip packages mentioned in requirements.txt
    • Launch the function: func host start
    • If you see errors such as Exception: ModuleNotFoundError: No module named 'PIL', then make sure that the pip package Pillow is mentioned in requirements.txt.
  • Open your "raw" Container in Azure Portal, upload an image. You should see your locally running function react to it.
    • Check the "processed" Container, the scaled image should appear there.
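Assembled from the steps above, __init__.py could look roughly like this. It is a sketch: thumbnail_filename is a helper name introduced here, and the Pillow import is deferred into main() so the file can be read without the library installed; in the Functions runtime you would keep import azure.functions as func and the typed signature shown earlier.

```python
from io import BytesIO  # imported per the steps above; useful for in-memory variants

def thumbnail_filename(blob_name):
    """'raw/cat.png' -> 'cat.png': drop the container prefix from the blob path."""
    return blob_name.split('/')[1]

def main(myblob, blobout):
    # In the Functions runtime the signature is:
    #   main(myblob: func.InputStream, blobout: func.Out[func.InputStream])
    from PIL import Image  # Pillow, the row added to requirements.txt

    filename = thumbnail_filename(myblob.name)
    output_image_name = "/tmp/" + filename

    base_image = Image.open(myblob)   # the blobTrigger input is file-like
    base_image.thumbnail((150, 150))  # resize in place, keeping the aspect ratio
    base_image.save(output_image_name, quality=95)

    # Whatever is set on 'blobout' is stored under the container path processed/{name}
    with open(output_image_name, "rb") as new_thumbfile:
        blobout.set(new_thumbfile.read())
```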

If the function works, publish the function to Azure cloud, so we don't need to run it locally anymore.

  • func azure functionapp publish <FUNCTION_APP_NAME> --build remote --python
  • Replace FUNCTION_APP_NAME with the name of the Function App that we previously created in the Azure portal through the web interface.
  • Move on to the directory webapp
  • To display the thumbnail (resize) images in the web pages, you have to update the webapp/app.py
    • Update append_message_to_file so that instead of raw_images_container the value "processed" is used.
  • Finally, start the flask application
    • If the application fails with an error like TypeError: can only concatenate str (not "NoneType") to str, make sure that you have set the environment variables (CONN_STRING, STORAGE_ACCOUNT)
  • Open the message-board application
    • Add messages and images
    • Now you should see the resized images on the web page
  • Deliverable: Take screenshot of the web page of application containing message and image (Your IP should be visible).

Exercise 14.4 Working with COSMOS Database

Azure Cosmos DB is a fully managed NoSQL database service for modern app development. In this task, we are going to use Cosmos DB to store the messages, using the Cosmos Python SDK to interact with it.

We will modify the message-board Flask application to store messages in the database instead of locally in data.json. For this, we modify app.py to interact with Cosmos DB.

  • Make sure you are under the directory webapp
  • Add entry of azure-cosmos in requirements.txt
    • Don't forget to do pip install
  • Create a COSMOS DB Service in Azure portal
    • Search for Azure COSMOS DB Service in Azure portal
    • Click on Create --> Core (SQL) - Recommended --> Create
      • Set the Resource Group to the previous Resource Group you created
      • Account Name: Choose a name freely
      • Region: North Europe
      • Click on Review+Create (it takes 2-3 minutes to create). (PS! If you see the error message "Sorry, we are currently experiencing high demand in this region ..." during service creation, then choose a different Region, for example Sweden Central)
  • Now let us get the endpoint URL and Master key to access the COSMOS DB using python SDK
    • Replace <RESOURCE_GROUP> and <ACCOUNT_NAME> in the command
      • export COSMOS_ACCOUNT_URI=$(az cosmosdb show --resource-group <RESOURCE_GROUP> --name <ACCOUNT_NAME> --query documentEndpoint --output tsv)
      • export MASTER_KEY=$(az cosmosdb keys list --resource-group <RESOURCE_GROUP> --name <ACCOUNT_NAME> --query primaryMasterKey --output tsv)
  • Now let us modify the app.py
    • Import the cosmos lib import azure.cosmos.cosmos_client as cosmos_client and

from azure.cosmos.partition_key import PartitionKey

  • Declare HOST= os.getenv('COSMOS_ACCOUNT_URI')
  • Declare Master Key MASTER_KEY= os.getenv('MASTER_KEY')
  • Declare the variables
    • This is the database name: DATABASE_ID = "<FREELY_CHOOSE>"
    • This is the container name: CONTAINER_ID = "<FREELY_CHOOSE>"
  • Create cosmos db client client = cosmos_client.CosmosClient(HOST, {'masterKey': MASTER_KEY} )
  • Create a database db = client.create_database_if_not_exists(id=DATABASE_ID)
  • Create a container container = db.create_container_if_not_exists(id=CONTAINER_ID, partition_key=PartitionKey(path='/id', kind='Hash'))
  • Write a function insert_cosmos, which takes content and img_path as input; it is similar to append_message_to_file
    • Declare a JSON dictionary new_message as in the append_message_to_file method, with an additional JSON attribute:
      • With key id and value can be generated using uuid.uuid4()
      • The json document looks like:
    new_message =  {
        'id': str(uuid.uuid4()), 
        'content': content,
        'img_path': img_path,
        'timestamp': datetime.now().isoformat(" ", "seconds")
    }
  • Insert an item into the database:
    try:
        container.create_item(body=new_message)
    except exceptions.CosmosResourceExistsError:
        print("Resource already exists, didn't insert message.")
  • (The above also needs import azure.cosmos.exceptions as exceptions )
  • Write a function read_cosmos to read the messages from Cosmos DB, similar to read_messages_from_file
    • Declare list to store the messages messages = []
    • Read the items from cosmos item_list = list(container.read_all_items(max_item_count=10))
    • Return item_list
  • Now, update the home function of app.py
    • Replace read_messages_from_file() with read_cosmos
    • Replace append_message_to_file with insert_cosmos
    • Update render_template's 2nd argument to data instead of data["messages"]
  • Test the application
  • Now build the docker image as performed in Practice 2.
  • Push the docker image to docker hub with image name lab14.v1
  • Run the container with environment variables (CONN_STRING, STORAGE_ACCOUNT, COSMOS_ACCOUNT_URI, MASTER_KEY)
  • Test the application by inserting the messages and images
    • Deliverable: Take screenshot of the web page of the application containing message and image (Your IP should be visible).
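The Cosmos pieces above can be gathered into the following sketch. build_message is a helper name introduced here for the document construction (the steps inline it), the azure.cosmos import is deferred into insert_cosmos so the rest can be read without the SDK installed, and container is the Cosmos container created above.

```python
import uuid
from datetime import datetime

def build_message(content, img_path):
    """Build the JSON document stored in Cosmos DB; id is generated with uuid4."""
    return {
        'id': str(uuid.uuid4()),
        'content': content,
        'img_path': img_path,
        'timestamp': datetime.now().isoformat(" ", "seconds"),
    }

def insert_cosmos(content, img_path, container):
    """Insert one message document into the Cosmos DB container."""
    # Imported lazily so this sketch can be read without the SDK installed.
    import azure.cosmos.exceptions as exceptions

    new_message = build_message(content, img_path)
    try:
        container.create_item(body=new_message)
    except exceptions.CosmosResourceExistsError:
        print("Resource already exists, didn't insert message.")

def read_cosmos(container):
    """Return up to 10 message documents from the Cosmos DB container."""
    return list(container.read_all_items(max_item_count=10))
```

home() then calls insert_cosmos instead of append_message_to_file and read_cosmos instead of read_messages_from_file.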

Exercise 14.5 Deploying application in Azure App Service

Azure App Service is a fully managed platform for building web applications. In this task, we are going to deploy the message-board application to App Service.

  • Search for App Services in the azure portal.
  • Click on Create app service
    • Choose Resource Group to your resource group created in the earlier task.
    • Choose Name freely
    • Select Publish --> Docker
    • Region --> North Europe
  • Click on Next:Docker>
    • Image Source --> Docker Hub
    • Image and tag --> Provide your docker hub image pushed in earlier task
  • Now, it will take a few minutes to deploy the application
  • After this, add environment variables (required for docker container run) as mentioned in the document
    • Here add these variables (STORAGE_ACCOUNT, COSMOS_ACCOUNT_URI) in Application settings and (CONN_STRING,MASTER_KEY) in Connection Strings
    • You need to modify app.py to access these variables. Prefix Application settings variables with APPSETTING_, for example APPSETTING_COSMOS_ACCOUNT_URI; Connection Strings variables should be prefixed with CUSTOMCONNSTR_.
    • Build and push the image to docker hub after modifications.
  • Restart the app service. (After restarting app service, newly updated docker image will be downloaded automatically)
  • You can enable the logs as mentioned here
  • Get the logs using in your terminal az webapp log tail --name <app_service> --resource-group <resource_group_name>
  • Go to app service and copy the URL and open your application in the browser.
  • Deliverable: Take screenshot of the web page of the application containing message and image (Your web address should be visible).
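The prefixing rule above can be handled with a small helper in app.py, so the same code runs both locally (plain variable names) and in App Service (prefixed names). This is a sketch under the prefix behaviour described in the step above; get_setting is a helper name introduced here.

```python
import os

def get_setting(name, prefix="APPSETTING_"):
    """Return a config value, trying the App Service-prefixed name first,
    then falling back to the plain name for local runs."""
    return os.getenv(prefix + name, os.getenv(name))

# Application settings are exposed with the APPSETTING_ prefix ...
STORAGE_ACCOUNT = get_setting("STORAGE_ACCOUNT")
COSMOS_ACCOUNT_URI = get_setting("COSMOS_ACCOUNT_URI")
# ... while Connection Strings use the CUSTOMCONNSTR_ prefix.
CONN_STRING = get_setting("CONN_STRING", prefix="CUSTOMCONNSTR_")
MASTER_KEY = get_setting("MASTER_KEY", prefix="CUSTOMCONNSTR_")
```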

Deliverables:

  1. Screenshots from Exercise 14.2, 14.3, 14.4 and 14.5
  2. Application code from 14.3 and 14.5
  3. Link to your messageboard deployed to App Services
  4. Link to the DockerHub image

