
MohsenAkhavan asked:

How to convert a Python program to an Azure Function?

Locally, I created a timer trigger function in Python that reads a CSV file from Blob Storage every 5 minutes and, after processing, saves the result to another storage account.
Everything works fine locally, but when I used "Deploy to Function App..." from VS Code to push it to Azure, it didn't work.
How can I get this code running as an Azure Function?

 import datetime
 import logging
 import os
 import json
 import uuid
 import pandas as pd
 from azure.storage.blob import BlobServiceClient, BlobClient, ContainerClient, __version__
    
    
 import azure.functions as func
    
    
 def main(mytimer: func.TimerRequest) -> None:
     utc_timestamp = datetime.datetime.utcnow().replace(
         tzinfo=datetime.timezone.utc).isoformat()
    
     if mytimer.past_due:
         logging.info('The timer is past due!')
    
     logging.info('Python timer trigger function ran at %s', utc_timestamp)
    
     def clear_time(input_string):
         input_string = input_string.split(" ")
         date, time = input_string[0], input_string[1]
         time = time.split("-")[0].split(".")[0]
         return f"{date} {time}"
    
     def clean_iloc(input_line):
         temp = {}
         temp_array = []
         body = json.loads(input_line["body"])
         element_id = input_line["serial_id"]
         MSG_TYPE_TAG = body["MSG_TYPE_TAG"]
    
         # temp["serial_id"]=element_id
         # temp["message_type"]=MSG_TYPE_TAG
         temp_array.append(element_id)
         temp_array.append(MSG_TYPE_TAG)
         if body["error"] != {}:
             print(body["error"])
    
         if MSG_TYPE_TAG == "300":
             time = clear_time(
                 body["GET_RIGIDSENSE_SENSOR_ACCELDATA_LOG_TAG"]["date_time"])
             # temp["data_time"]=time
             temp_array.append(time)
             acceleration_array = body["GET_RIGIDSENSE_SENSOR_ACCELDATA_LOG_TAG"]["acceleration_array"]
             for i in range(100):
                 # temp[f"acceleration_element_{i}"]=acceleration_array[i]
                 temp_array.append(acceleration_array[i])
    
         else:
    
             time = clear_time(
                 body["GET_RIGIDSENSE_DEVICE_SHORTSTATUS_LOG_TAG"]["date_time"])
             # temp["data_time"]=time
             temp_array.append(time)
             for i in range(100):
                 # temp[f"acceleration_element_{i}"]=None
                 temp_array.append(None)
         return temp_array
    
     try:
    
         # Source Data Storage Variable
         urlblob = "******"
    
         # RAW Data Blob Storage Connection String
         raw_connect_str = "******"
    
         # Create the BlobServiceClient object which will be used to create a container client
         blob_service_client = BlobServiceClient.from_connection_string(
             raw_connect_str)
    
         # Create a unique name for the container
         container_name = str(uuid.uuid4())
    
         # Create the container
         container_client = blob_service_client.create_container(container_name)
         print("reading csv file...")
         df = pd.read_csv(urlblob)
         print("file read :D ")
    
         dataframe = pd.DataFrame({})
         dataframe["serial_id"] = []
         dataframe["message_type"] = []
         dataframe["data_time"] = []
         for i in range(100):
             dataframe[f"acceleration_element_{i}"] = []
         for i in range(df.shape[0]):
             dataframe.loc[i] = clean_iloc(df.iloc[i])
            
         # Create a blob client using the local file name as the name for the blob
         dataframe.to_csv("dataframe.csv")
         blob_client = blob_service_client.get_blob_client(
             container=container_name, blob='dataframe.csv')
         print("\nUploading to Azure Storage as blob:\t" + 'dataframe.csv')
         # Upload the created file
         with open("dataframe.csv", "rb") as data:
             blob_client.upload_blob(data)
    
     except Exception as ex:
         print('Exception:')
         print(ex)



azure-functions

MohsenAkhavan answered:

I found the solution: the file has to be written to a temporary path, because the deployed function's application directory is typically read-only.

 # Write the CSV to the writable temp directory instead of the application folder
 import os
 import tempfile

 temp_path = tempfile.gettempdir()
 file_path = os.path.join(temp_path, 'dataframe.csv')
 dataframe.to_csv(file_path)
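
The open() call that uploads the file needs the same temporary path; a minimal sketch, reusing the blob_client from the question's code:

 # Upload from the temp-directory path instead of the application folder
 with open(file_path, "rb") as data:
     blob_client.upload_blob(data)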




MayankBargali-MSFT answered:

Hi @MohsenAkhavan

You can follow the publish-to-Azure steps to deploy your function app. Make sure you have published your application settings as well, and verify in the Visual Studio Code logs whether the function app was deployed successfully.
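
For example, rather than hard-coding the connection string as in the question's code, it can be read from an application setting. A minimal sketch; the setting name below is only illustrative:

 import os
 from azure.storage.blob import BlobServiceClient

 # Hypothetical setting name: define it under the Function App's Application settings
 # in Azure (and under "Values" in local.settings.json for local runs)
 raw_connect_str = os.environ["RAW_STORAGE_CONNECTION_STRING"]
 blob_service_client = BlobServiceClient.from_connection_string(raw_connect_str)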

Once the function app is deployed, navigate to your timer trigger function --> Monitor (Developer) --> Invocations.
Invocations shows whether your trigger fired successfully. Wait a while, and if there is still no entry, check the function logs for errors. I would also suggest verifying the CRON expression; a sketch of the timer binding is below.
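
For an every-5-minutes schedule, the function.json next to __init__.py would look roughly like this (NCRONTAB format; the "name" value must match the function's mytimer parameter):

 {
   "scriptFile": "__init__.py",
   "bindings": [
     {
       "name": "mytimer",
       "type": "timerTrigger",
       "direction": "in",
       "schedule": "0 */5 * * * *"
     }
   ]
 }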

Feel free to reach out to me if you are still facing the issue.
