Practice 14 - Application deployment in Azure Cloud Services
In this practice session, you will learn about application deployment on Azure public cloud services. An application can use multiple services, such as object and database storage, web application hosting, and further scalability services. Azure provides a variety of services that enable faster design and deployment of applications. The aim of the lab is to design and deploy the message board application with extended features (image upload, storage, and access) using Azure services.
References
- Azure Cloud documentation: https://docs.microsoft.com/en-us/azure/?product=featured
- Azure Functions: https://azure.microsoft.com/en-us/services/functions/
- Azure Blob Storage: https://azure.microsoft.com/en-us/services/storage/blobs/
- Azure Cosmos DB: https://cosmos.azure.com/
Exercise 14.1 Introduction to application and setting up of development machine
In this practice session, we are going to re-use the message-board application code (messages stored locally in data.json) developed in Practice Session 2. We will extend the message board application using different Azure services, as described below.
- Update home.html with an option to upload images along with text messages, store the image in the Azure Blob Storage service, and finally access and display the same image on the web page along with the messages.
- Images uploaded by end-users come in different sizes, so design a resize (thumbnail) Azure Function that resizes an image whenever it is uploaded into blob storage. The resize function should be triggered when a new image arrives in Azure Blob Storage, and its output image should be stored back in blob storage.
- Use Azure Cosmos DB (a NoSQL database) to store the message data as JSON documents.
- Finally, create and deploy the message board application to Azure App Service.
We will use a VM as a development machine to develop the application and work with Azure tools and services. Launch an OpenStack VM, install the Azure CLI, and get ready with the existing message board application code.
- Use your last name as a part of the instance name!
- Source: Use Volume Snapshot, choose `Ubuntu 20 + Docker 20.10`
  - In this Ubuntu-based snapshot, the installation of Docker (as we did in Lab 2) has already been done for us.
  - Enable "Delete Volume on Instance Delete"
- Flavour: should be `m2.tiny`
- Install the Python virtual environment package, then create and activate a Python virtual environment.
- Install the Azure CLI as mentioned in Lab 13 (Exercise 13.5) and log in to your Azure account using the command `az login`.
- Clone the message board code: `git clone https://bitbucket.org/jaks6/cloud-computing-2022-lab-1.git webapp`, or reuse the code you submitted in Practice 2.
Now, modify the application as per the instructions below:
- Update `home.html` so that users can upload an image along with their message. The image should end up being stored on disk. (Make sure your name is displayed on the web page.)
  - For example, you can follow the approach explained in this guide. (However, do not include the parts about using .env, "flash" and "flash messages"; they can cause conflicts when deploying to Azure later.)
- Make sure you update `home.html` to display the saved image along with the text message.
  - Tip: Update the `append_message_to_file` function so that when messages are stored to disk (the data.json file), the filepath of the image is also saved in addition to the message contents (call the new property 'img_path').
After these updates, the application should work something like this:
- Test the application.
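The tip above about extending `append_message_to_file` can be sketched as follows. This is a minimal, stdlib-only sketch: it assumes `data.json` holds an object of the form `{"messages": [...]}`, as in the Practice 2 code; adjust the structure to match your own file.

```python
import json
import os
from datetime import datetime

DATA_FILE = "data.json"  # same file the app already uses

def append_message_to_file(content, img_path):
    """Store a message plus the filepath of its uploaded image in data.json."""
    new_message = {
        "content": content,
        "img_path": img_path,  # new property holding the saved image's filepath
        "timestamp": datetime.now().isoformat(" ", "seconds"),
    }
    # Load the existing messages (or start fresh if the file does not exist yet)
    if os.path.exists(DATA_FILE):
        with open(DATA_FILE) as f:
            data = json.load(f)
    else:
        data = {"messages": []}
    data["messages"].append(new_message)
    with open(DATA_FILE, "w") as f:
        json.dump(data, f)
```

Later in Exercise 14.2 the same `img_path` slot will hold a Blob Storage URL instead of a local path, so keeping it as a plain string here makes that switch trivial.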
Exercise 14.2 Working with the Azure Blob Storage
In this task, we are going to work with Azure Blob Storage. Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that doesn't adhere to a particular data model or definition, such as text or binary data. Here, we are going to store the images uploaded by end-users in Azure Blob Storage.
- Add the Python SDK for Azure Blob Storage, `azure-storage-blob==12.3.1`, to `requirements.txt`.
- Let us create an Azure resource group, storage account, and containers to store the images.
- Use the guidelines from this manual and follow accordingly:
  - Create a resource group with the name `lab14`; the location should be `northeurope`.
  - Create a storage account named after your last name (`<storage-account>`).
  - Create a storage container with the name `raw`: `az storage container create --account-name <storage-account> --name <container>`
  - Make the blobs available with public access. Please refer to the az command here.
  - Note down the account key from your storage account using `az storage account keys list --account-name <STORAGE_ACCOUNT>`. Use the `key1` value; we will use it as the connection string in the next steps.
  - Set `STORAGE_ACCOUNT=<STORAGE_ACCOUNT>` and `CONN_STRING=<CONN_STRING>` as environment variables.
- Now, let us update `app.py` to use Azure blobs:
  - Import the Azure Blob Storage library: `from azure.storage.blob import BlobServiceClient`
  - Read the environment variables, `connect_str = os.getenv('CONN_STRING')` and `storage_account = os.getenv('STORAGE_ACCOUNT')`, and declare the variable `raw_images_container = 'raw'`
  - Create the connection client: `blob_service_client = BlobServiceClient(account_url="https://" + storage_account + ".blob.core.windows.net/", credential=connect_str)`
- Now, write a function `insert_blob` in `app.py` to upload the image into an Azure blob:
  - The function receives the image path as input: `insert_blob(img_path)`
  - Get the filename of the image: `filename = img_path.split('/')[-1]`
  - You can refer to this document for an example of uploading an item to a blob using the Azure Blob Python library.
  - Create a blob client using the local file name as the name for the blob: `blob_client = blob_service_client.get_blob_client(container=raw_images_container, blob=filename)`
  - Upload the image file to the blob. (Refer to the "# Upload the created file" section in the linked document.)
- Update the `home()` method:
  - Call the function `insert_blob(img_path)` with the saved image path, received from the end user via the web page, as the argument.
  - Further, let's change how `append_message_to_file` gets called in `home()`. We will update the `img_path` value in `data.json` to store the image's Blob Storage URL instead of the local disk path. For this, change the 2nd argument of `append_message_to_file` to: `append_message_to_file(new_message, blob_url)`
    - For `blob_url`, use the following value: `'https://' + storage_account + '.blob.core.windows.net/' + raw_images_container + '/' + img_path.split('/')[-1]`
- Install the requirements.txt packages
- Test the application.
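The `insert_blob` function and the `blob_url` value described above can be sketched as below. For clarity (and easier testing) this sketch passes the `BlobServiceClient` in as a parameter and factors the URL out into a helper; the helper name `make_blob_url` is our own, and the lab's version can instead use the module-level `blob_service_client` and build the URL inline.

```python
def make_blob_url(storage_account, container, img_path):
    """Build the public URL of a blob from the local image path."""
    filename = img_path.split("/")[-1]
    return ("https://" + storage_account + ".blob.core.windows.net/"
            + container + "/" + filename)

def insert_blob(img_path, blob_service_client, container="raw"):
    """Upload a local image file to the given blob container."""
    filename = img_path.split("/")[-1]
    # The local file name doubles as the blob name
    blob_client = blob_service_client.get_blob_client(container=container,
                                                      blob=filename)
    with open(img_path, "rb") as data:
        # overwrite=True is an optional extra so re-uploads don't fail
        blob_client.upload_blob(data, overwrite=True)
    return filename
```

Because the container is public, the URL returned by `make_blob_url` can be placed directly in the `img_path` field of the stored message.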
- Deliverable: Take a screenshot of the application web page containing a message and image (your IP should be visible).
Exercise 14.3 Working with Blob Triggers and Azure functions
In this task, we will create a function to resize images stored in the Azure blob. Azure Functions can be triggered when new items are inserted into an Azure blob container; they support invocations based on multiple types of event triggers. Here, we write an Azure Function that is triggered by the storage container's insert event, generates a thumbnail (resized version) of the new image inserted into the blob, and stores the output in another container.
- Create a container with the name `processed` (also, make it public as in Task 14.2) to store the resized images.
- Install the Azure Functions Core Tools command-line tools as mentioned in the document.
- To make sure everything is set up correctly, let's look at which Azure Functions templates are available. They can be used as a basis for creating functions: `func templates list -l python`
- Find Function App on the Azure portal to create a Function App. (PS!! You can also create it using a command like `az functionapp create --consumption-plan-location northeurope --name <FUNCTION_APP_NAME> --os-type Linux --resource-group <RESOURCE_GROUP> --runtime python --runtime-version 3.8 --storage-account <STORAGE_ACCOUNT>`)
  - Create a new Azure Function App. It works as a container for Azure Functions.
  - FUNCTION_APP_NAME: a name of your choice; note it down, we will need it later.
  - RESOURCE_GROUP: the resource group you created in the second task.
  - STORAGE_ACCOUNT: the storage account name you created in the previous task.
- Make directory in the development machine
mkdir ~/function && cd ~/function
- Create a new local Azure Functions project that uses the Python environment: `func init localhsproj --python`
- Let's move to the created folder
cd localhsproj
- Create a function: `func new --name FUNCTION_NAME --template "Azure Blob Storage trigger"`
  - NB! Replace `FUNCTION_NAME` with a name of your choice.
  - We choose the base type of the function, the `Azure Blob Storage trigger` template: this means that the function is designed to start when Azure Blob service events occur (such as when a new blob is uploaded).
  - As a result, a folder named `FUNCTION_NAME` is created.
- Move to the `FUNCTION_NAME` directory; we will modify the contents of the `function.json` file in `bindings`:
  - Set `connection` to `AzureWebJobsStorage`
    - Through the AzureWebJobsStorage environment variable, our function gains access to the Azure Blob Storage connection information, which can be used to create a new connection (such as creating or modifying a file).
  - Update the `path` value from `samples-workitems/{name}` to the location of our Azure Blob Storage container: `raw/{name}`
    - This configures the integration between the `raw` blob container and the function we create. The function is started for each new file.
  - Update `local.settings.json`, setting the value of AzureWebJobsStorage to `"DefaultEndpointsProtocol=https;AccountName=<STORAGE_ACCOUNT>;AccountKey=<CONN_STRING>;EndpointSuffix=core.windows.net"`
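Putting the pieces together, `local.settings.json` might look roughly like this (a sketch with the same placeholder values as above; the file generated by `func init` may contain additional keys):

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<STORAGE_ACCOUNT>;AccountKey=<CONN_STRING>;EndpointSuffix=core.windows.net"
  }
}
```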
- Run the function app locally:
- Move to the Function App root folder ( localhsproj )
- Install all the Python libraries specified inside the requirements.txt file:
pip install -r requirements.txt
- Start the function service locally:
func host start
- Now, let us modify the function to resize the new image and store it into the `processed` blob container:
  - Let's modify the `requirements.txt` file to add a Python library for handling images: add a row `Pillow` for the library.
- Move to the function directory and edit the `function.json` file:
  - This file defines where the function takes inputs and what its outputs are.
- Currently, only input is defined in the file (myblob, which is of type blobTrigger).
- This means that any objects appearing under the container path "raw/{name}" get passed to the function under the parameter name "myblob"
- Add one more element to the bindings list of the json:
{ "name": "blobout", "type": "blob", "direction": "out", "path": "processed/{name}", "connection": "AzureWebJobsStorage" }
- This tells Azure that any result written to "blobout" should be stored to the Blob Storage container path
"processed/{name}"
.
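With the output binding added, the complete `function.json` should look roughly like the following. The `myblob` trigger entry is the one generated from the template, with its `path` already changed to `raw/{name}`; minor details may differ depending on your Core Tools version:

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "raw/{name}",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "blobout",
      "type": "blob",
      "direction": "out",
      "path": "processed/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```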
- Implement the contents of the `__init__.py` file:
  - Import the required libraries:
from io import BytesIO
from PIL import Image
- Let's change the `main()` method signature to include the output object reference in the arguments. It should now look like this: `main(myblob: func.InputStream, blobout: func.Out[func.InputStream]):`
- Let's change the content of the `main()` method:
  - Get the filename: `filename = myblob.name.split('/')[1]`
  - Read the input image: `input_image = myblob`
  - Declare a name for the output image: `output_image_name = "/tmp/" + filename`
  - Open the image: `base_image = Image.open(input_image)`
  - Define the size of the thumbnail: `new_size = 150, 150`
  - Create the thumbnail: `base_image.thumbnail(new_size)`
  - Save the thumbnail image: `base_image.save(output_image_name, quality=95)`
  - Set the thumbnail as the output blob: `new_thumbfile = open(output_image_name, "rb")` and `blobout.set(new_thumbfile.read())`
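The resize steps above can be sketched as a small helper that contains only the Pillow logic, so it can be run without the Azure runtime; the `main()` entry point (shown as a comment, since it needs the `azure.functions` package) simply connects the trigger input and the output binding to it. The helper name `make_thumbnail` and the in-memory `BytesIO` output are our own choices; the lab's version writes to a `/tmp` file instead.

```python
from io import BytesIO
from PIL import Image

def make_thumbnail(input_stream, size=(150, 150)):
    """Resize an image read from a file-like object; return the thumbnail bytes."""
    base_image = Image.open(input_stream)
    base_image.thumbnail(size)           # resizes in place, keeping aspect ratio
    out = BytesIO()
    fmt = base_image.format or "PNG"     # thumbnail() preserves the format attribute
    base_image.save(out, format=fmt, quality=95)
    return out.getvalue()

# In __init__.py the Azure Function entry point would call it like:
#
# import azure.functions as func
#
# def main(myblob: func.InputStream, blobout: func.Out[func.InputStream]):
#     blobout.set(make_thumbnail(myblob))
```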
- Now, let us test the function locally:
  - Make sure that you are under the directory `function/localhsproj`
  - Install the pip packages mentioned in `requirements.txt`
  - Launch the function: `func host start`
    - If you get errors such as `Exception: ModuleNotFoundError: No module named 'PIL'`, then make sure that the pip package `Pillow` is mentioned in `requirements.txt`.
- Open your "raw" Container in Azure Portal, upload an image. You should see your locally running function react to it.
- Check the "processed" Container, the scaled image should appear there.
If the function works, publish the function to Azure cloud, so we don't need to run it locally anymore.
func azure functionapp publish <FUNCTION_APP_NAME> --build remote --python
- Replace `FUNCTION_APP_NAME` with the name of the Function App that we previously created in the Azure portal through the web interface.
- Move on to the directory
webapp
- To display the thumbnail (resized) images on the web page, you have to update `webapp/app.py`:
  - Update `append_message_to_file` so that the value "processed" is used instead of `raw_images_container`.
- Finally, start the flask application
- If you get an error like `TypeError: can only concatenate str (not "NoneType") to str`, make sure that you have set the environment variables (CONN_STRING, STORAGE_ACCOUNT).
- Open the message-board application
- Add messages and images
- Now you should see the resized images on the web page
- Deliverable: Take a screenshot of the application web page containing a message and image (your IP should be visible).
Exercise 14.4 Working with COSMOS Database
Azure Cosmos DB is a fully managed NoSQL database service for modern app development. In this task, we are going to use Cosmos DB to store the messages in the database. We are going to use the Cosmos Python SDK to interact with Azure Cosmos DB.
We will modify the message-board Flask application to store the messages in the database instead of locally in `data.json`. Here we modify `app.py` to interact with Cosmos DB.
- Make sure that you are under the directory `webapp`
- Add an entry for `azure-cosmos` in `requirements.txt`
  - Don't forget to do `pip install`
- Create a COSMOS DB Service in Azure portal
- Search for Azure COSMOS DB Service in Azure portal
- Click on Create --> Core (SQL) - Recommended --> Create
- Set the Resource Group to the previous Resource Group you created
- Account Name: choose a name of your choice.
- Region: North Europe
- Click on Review+Create. (It takes 2-3 minutes to create.) (PS!! If you experience the error message "Sorry, we are currently experiencing high demand in this region ..." on service creation, then choose a different Region, for example Sweden Central.)
- Now let us get the endpoint URL and Master key to access the COSMOS DB using python SDK
- Replace <RESOURCE_GROUP> and <ACCOUNT_NAME> in the command
export COSMOS_ACCOUNT_URI=$(az cosmosdb show --resource-group <RESOURCE_GROUP> --name <ACCOUNT_NAME> --query documentEndpoint --output tsv)
export MASTER_KEY=$(az cosmosdb keys list --resource-group <RESOURCE_GROUP> --name <ACCOUNT_NAME> --query primaryMasterKey --output tsv)
- Now let us modify `app.py`:
  - Import the Cosmos libraries: `import azure.cosmos.cosmos_client as cosmos_client` and `from azure.cosmos.partition_key import PartitionKey`
  - Declare the host: `HOST = os.getenv('COSMOS_ACCOUNT_URI')`
  - Declare the master key: `MASTER_KEY = os.getenv('MASTER_KEY')`
  - Declare the variables:
    - The database name: `DATABASE_ID = "<FREELY_CHOOSE>"`
    - The container name: `CONTAINER_ID = "<FREELY_CHOOSE>"`
  - Create the Cosmos DB client: `client = cosmos_client.CosmosClient(HOST, {'masterKey': MASTER_KEY})`
  - Create a database: `db = client.create_database_if_not_exists(id=DATABASE_ID)`
  - Create a container: `container = db.create_container_if_not_exists(id=CONTAINER_ID, partition_key=PartitionKey(path='/id', kind='Hash'))`
- Write a function `insert_cosmos`, which takes `content` and `img_path` as input; it is similar to `append_message_to_file`:
  - Declare a JSON dictionary `new_message` as in the `append_message_to_file` method, with an additional attribute: the key `id`, whose value can be generated using uuid.uuid4(). The JSON document looks like: `new_message = { 'id': str(uuid.uuid4()), 'content': content, 'img_path': img_path, 'timestamp': datetime.now().isoformat(" ", "seconds") }`
  - Insert an item into the database: `try: container.create_item(body=new_message) except exceptions.CosmosResourceExistsError: print("Resource already exists, didn't insert message.")`
    - (The above also needs `import azure.cosmos.exceptions as exceptions`)
- Write a function `read_cosmos` to read the messages from Cosmos, similar to `read_messages_from_file`:
  - Declare a list to store the messages: `messages = []`
  - Read the items from Cosmos: `item_list = list(container.read_all_items(max_item_count=10))`
  - Return the data: `item_list`
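The two Cosmos functions can be sketched as follows. To keep the sketch runnable without a live Cosmos account, the `container` client is passed in as a parameter (the lab's version uses the module-level `container`), and the duplicate-handling `try/except` on `azure.cosmos.exceptions.CosmosResourceExistsError` is indicated in a comment rather than imported here.

```python
import uuid
from datetime import datetime

def insert_cosmos(container, content, img_path):
    """Store one message as a JSON document in the Cosmos DB container."""
    new_message = {
        "id": str(uuid.uuid4()),  # Cosmos requires a unique 'id' per document
        "content": content,
        "img_path": img_path,
        "timestamp": datetime.now().isoformat(" ", "seconds"),
    }
    # In the lab's version, wrap this call in try/except
    # azure.cosmos.exceptions.CosmosResourceExistsError to skip duplicates.
    container.create_item(body=new_message)
    return new_message

def read_cosmos(container):
    """Read up to 10 message documents from the Cosmos DB container."""
    item_list = list(container.read_all_items(max_item_count=10))
    return item_list
```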
- Now, update the `home` function of `app.py`:
  - Replace `read_messages_from_file()` with `read_cosmos()`
  - Replace `append_message_to_file` with `insert_cosmos`
  - Update the 2nd argument of `render_template` to `data` instead of `data["messages"]`
- Test the application
- Now build the docker image as performed in Practice 2.
- Push the docker image to docker hub with image name
lab14.v1
- Run the container with environment variables (CONN_STRING, STORAGE_ACCOUNT, COSMOS_ACCOUNT_URI, MASTER_KEY)
- Test the application by inserting the messages and images
- Deliverable: Take a screenshot of the application web page containing a message and image (your IP should be visible).
Exercise 14.5 Deploying application in Azure App Service
Azure App Service is a fully managed platform for building web applications. In this task, we are going to deploy the message-board application to App Service.
- Search for App Services in the azure portal.
- Click on Create app service
- Choose Resource Group to your resource group created in the earlier task.
- Choose Name freely
- Select Publish --> Docker
- Region --> North Europe
- Click on Next:Docker>
- Image Source --> Docker Hub
- Image and tag --> Provide your docker hub image pushed in earlier task
- Now, it will take a few minutes to deploy the application.
- After this, add environment variables (required for docker container run) as mentioned in the document
- Here add these variables (STORAGE_ACCOUNT, COSMOS_ACCOUNT_URI) in Application settings and (CONN_STRING,MASTER_KEY) in Connection Strings
- You need to modify `app.py` to access these variables. Prefix application setting variables with APPSETTING_ (for example, APPSETTING_COSMOS_ACCOUNT_URI), and prefix connection string variables with CUSTOMCONNSTR_.
- Build and push the image to Docker Hub after the modifications.
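In practice this means the environment lookups in `app.py` change as follows. A small helper with a fallback (the name `get_setting` is our own, not part of the lab code) keeps the same code working both locally, where the plain names are set, and in App Service, where the prefixed names appear:

```python
import os

def get_setting(name, prefix):
    """Read an env var, preferring the App Service-prefixed form over the plain name."""
    return os.getenv(prefix + name, os.getenv(name))

# App settings arrive with the APPSETTING_ prefix ...
storage_account = get_setting("STORAGE_ACCOUNT", "APPSETTING_")
cosmos_uri = get_setting("COSMOS_ACCOUNT_URI", "APPSETTING_")
# ... and connection strings with the CUSTOMCONNSTR_ prefix
conn_string = get_setting("CONN_STRING", "CUSTOMCONNSTR_")
master_key = get_setting("MASTER_KEY", "CUSTOMCONNSTR_")
```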
- Restart the app service. (After restarting app service, newly updated docker image will be downloaded automatically)
- You can enable the logs as mentioned here
- Get the logs in your terminal using: `az webapp log tail --name <app_service> --resource-group <resource_group_name>`
- Go to app service and copy the URL and open your application in the browser.
- Deliverable: Take a screenshot of the application web page containing a message and image (your web address should be visible).
Deliverables:
- Screenshots from Exercise 14.2, 14.3, 14.4 and 14.5
- Application code from 14.3 and 14.5
- Link to your messageboard deployed to App Services
- Link to the DockerHub image