Integrating Vision with Oracle Analytics Cloud (OAC)

Create a Data Integration flow that uses the Vision SDK to detect objects in images and project that information into a table in a data warehouse. This output data is then used by Oracle Analytics Cloud to create visualizations and find patterns.

This is the high-level flow of the system:
Figure 1. High level flow between Vision and OAC
The flow between Vision and OAC, starting with the input data asset as a CSV file into Data Integration. This is processed and passed out as the output data into an ADW objects table which is then used as a source for OAC.

Before You Begin

To follow this tutorial, you must be able to create VCNs, functions, and API gateways, and to use Data Integration and Vision.

Talk to your administrator about the policies required.

Setting Up the Required Policies

Follow these steps to set up the required policies.

  1. In the Console navigation menu, click Identity & Security.
  2. Under Identity, click Policies.
  3. Click Create Policy.
  4. In the Create Policy panel, populate Name and Description.
    For Name, enter a name without any spaces. You can use alphanumeric characters, hyphens, periods, and underscores only.

    For Description, enter a description to help other users know the purpose of this set of policies.

  5. In Policy Builder, use the manual editor to add the following statements:
    allow group <group-name> to use cloud-shell in tenancy
    allow group <group-name> to inspect all-resources in tenancy
    allow group <group-name> to read instances in tenancy
    allow group <group-name> to read audit-events in tenancy
    allow group <group-name> to manage dis-workspaces in compartment <compartment-name>
    allow group <group-name> to manage dis-work-requests in compartment <compartment-name>
    allow group <group-name> to manage virtual-network-family in compartment <compartment-name>
    allow group <group-name> to manage tag-namespaces in compartment <compartment-name>
    allow service dataintegration to use virtual-network-family in compartment <compartment-name>
    allow group <group-name> to manage object-family in compartment <compartment-name>
    allow group <group-name> to manage functions-family in compartment <compartment-name>
    allow group <group-name> to manage api-gateway-family in compartment <compartment-name>
    allow group <group-name> to inspect instance-family in compartment <compartment-name>
    allow group <group-name> to manage autonomous-database-family in compartment <compartment-name>
    allow group <group-name> to use analytics-instances in compartment <compartment-name>
    allow group <group-name> to manage repos in tenancy
    allow group <group-name> to read objectstorage-namespaces in tenancy
    allow group <group-name> to manage logging-family in compartment <compartment-name>
    allow group <group-name> to read metrics in compartment <compartment-name>
    allow group <group-name> to use apm-domains in compartment <compartment-name>
    allow service faas to use apm-domains in compartment <compartment-name>
    allow group <group-name> to use ai-service-vision-family in compartment <compartment-name>
  6. Click Create.

1. Create a Virtual Cloud Network

Create a VCN to serve as the home for the serverless function and the API gateway created later in the tutorial.

1.1 Creating a VCN with Internet Access

Follow these steps to create a VCN with internet access.

  1. In the navigation menu, click Networking.
  2. Click Virtual Cloud Networks.
  3. Click Start VCN Wizard.
  4. Select Create VCN with Internet Connectivity.
  5. Click Start VCN Wizard.
  6. Enter a name for the VCN. Avoid entering confidential information.
  7. Click Next.
  8. Click Create.

1.2 Accessing your VCN from the Internet

You must add a new stateful ingress rule for the public regional subnet to allow traffic on port 443.

Complete 1.1 Creating a VCN with Internet Access before trying this task.

The API Gateway communicates on port 443, which isn't open by default.

  1. In the navigation menu, click Networking.
  2. Click Virtual Cloud Networks.
  3. Select the VCN you created in 1.1 Creating a VCN with Internet Access.
  4. Click the name of the public regional subnet.
  5. Under Security Lists, select Default Security List.
  6. Click Add Ingress Rules.
  7. Update the following fields with the values given:
    • Source Type: CIDR
    • Source CIDR: 0.0.0.0/0
    • IP Protocol: TCP
    • Source Port Range: All
    • Destination Port Range: 443
  8. Click Add Ingress Rules to add the new rule to the default security list. (A scripted alternative using the OCI Python SDK follows these steps.)
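If you prefer to script this step, the following minimal sketch uses the OCI Python SDK to append the same rule to the subnet's security list. It assumes a configured ~/.oci/config profile; the security list OCID is a placeholder you supply.

# Sketch: add a stateful ingress rule for TCP port 443 to an existing security list.
import oci

config = oci.config.from_file()
network = oci.core.VirtualNetworkClient(config)

security_list_id = "<security-list-ocid>"  # placeholder: OCID of the default security list
current = network.get_security_list(security_list_id).data

https_rule = oci.core.models.IngressSecurityRule(
    protocol="6",                # 6 = TCP
    source="0.0.0.0/0",
    source_type="CIDR_BLOCK",
    is_stateless=False,          # stateful rule
    tcp_options=oci.core.models.TcpOptions(
        destination_port_range=oci.core.models.PortRange(min=443, max=443)
    ),
)

# Preserve the existing rules and append the new one.
network.update_security_list(
    security_list_id,
    oci.core.models.UpdateSecurityListDetails(
        ingress_security_rules=current.ingress_security_rules + [https_rule]
    ),
)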

2. Creating an API Gateway

An API Gateway lets you aggregate all the functions you created into a single endpoint that your users can consume.

Complete 1. Create a Virtual Cloud Network before trying this task.

  1. From the Console navigation menu, click Developer Services.
  2. Click Gateways.
  3. Click Create Gateway.
  4. Enter a Name for the gateway. Avoid entering confidential information.
  5. Set the Type of the gateway to Public.
  6. Select the Compartment to create the API Gateway resources in.
  7. Select the name of the VCN to use with the API Gateway. Use the name of the VCN you created in section 1. Create a Virtual Cloud Network.
  8. Select the name of the regional subnet in the VCN. Use the public subnet whose security list you updated in 1.2 Accessing your VCN from the Internet.
  9. Click Create Gateway.

    When the API Gateway is created, it's shown as Active in the list on the Gateways page.
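If you'd rather create the gateway programmatically, here is a minimal sketch using the OCI Python SDK's API Gateway client. The display name, compartment OCID, and subnet OCID are assumptions you replace with your own values.

# Sketch: create a public API gateway; the OCIDs and name are placeholders.
import oci

config = oci.config.from_file()
client = oci.apigateway.GatewayClient(config)

details = oci.apigateway.models.CreateGatewayDetails(
    display_name="vision-gateway",        # assumed example name
    compartment_id="<compartment-ocid>",
    endpoint_type="PUBLIC",
    subnet_id="<public-subnet-ocid>",     # the public regional subnet from section 1
)
response = client.create_gateway(details)
print(response.data.lifecycle_state)      # CREATING at first; ACTIVE when ready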

3. Create an Enrichment Function

Follow these steps to create an enrichment function that can be called from Oracle Cloud Infrastructure Data Integration.

Create a serverless function that runs only on demand. The function conforms to the schema that Data Integration requires, and it calls the Vision API through the Python SDK.

3.1 Creating an Application

To add a function, first create an application.

Complete 2. Creating an API Gateway before trying this task.

  1. From the Console navigation menu, click Developer Services.
  2. Click Applications.
  3. Click Create Application.

    You can think of an application as a bounded context where several functions can reside.

  4. Enter a Name. Avoid entering confidential information.
  5. Select the VCN created in section 1. Create a Virtual Cloud Network.
  6. Select the VCN's public subnet.
  7. Click Create.
  8. When the application is created, open it and click Getting Started.

    Set up the fn CLI so that it can deploy functions to the correct compartment and container registry.

  9. Click Cloud Shell Setup.
  10. Click Launch Cloud Shell.
    This starts a Linux virtual machine with all the configuration needed to set up functions.
  11. Follow steps 1 to 7 in Setup fn CLI on Cloud Shell, or review the video Create Serverless Functions on Oracle Cloud.
    Note

    If step 4 isn't clear, see Creating an Fn Project CLI Context to Connect to Oracle Cloud Infrastructure in the Functions documentation. You can select any term for OCIR-REPO; it's a prefix used as the name of the container registry to which the function is deployed.

3.2 Creating a Function

Follow these steps to create a function in your application.

Complete 3.1 Creating an Application before trying this task.

The fastest way to create the function is to have the system generate a Python template.

  1. Run the following command in the cloud shell:
    fn init --runtime python object-detection
    cd object-detection
    It generates three files:
    • func.yaml
    • requirements.txt
    • func.py
  2. Replace the content of each of the three files as follows:
func.yaml

Recommended content for func.yaml.

schema_version: 20180708
name: object-detection
version: 0.0.1
runtime: python
build_image: fnproject/python:3.8-dev
run_image: fnproject/python:3.8
entrypoint: /python/bin/fdk /function/func.py handler
memory: 256
timeout: 300
requirements.txt

Recommended content for requirements.txt.

fdk>=0.1.40
oci
https://objectstorage.us-ashburn-1.oraclecloud.com/n/axhheqi2ofpb/b/vision-oac/o/vision_service_python_client-0.3.9-py2.py3-none-any.whl
pandas
requests
func.py

Recommended content for func.py.

import io
import json
import logging
import pandas
import requests
import base64
from io import StringIO
from fdk import response
 
import oci
from vision_service_python_client.ai_service_vision_client import AIServiceVisionClient
from vision_service_python_client.models.analyze_image_details import AnalyzeImageDetails
from vision_service_python_client.models.image_object_detection_feature import ImageObjectDetectionFeature
from vision_service_python_client.models.inline_image_details import InlineImageDetails
 
def handler(ctx, data: io.BytesIO=None):
    signer = oci.auth.signers.get_resource_principals_signer()
    resp = do(signer,data)
    return response.Response(
        ctx, response_data=resp,
        headers={"Content-Type": "application/json"}
    )
 
def vision(dip, txt):
    encoded_string = base64.b64encode(requests.get(txt).content)
 
    image_object_detection_feature = ImageObjectDetectionFeature()
    image_object_detection_feature.max_results = 5
    features = [image_object_detection_feature]
    analyze_image_details = AnalyzeImageDetails()
    inline_image_details = InlineImageDetails()
    inline_image_details.data = encoded_string.decode('utf-8')
    analyze_image_details.image = inline_image_details
    analyze_image_details.features = features
    try:
        le = dip.analyze_image(analyze_image_details=analyze_image_details)
    except Exception as e:
        print(e)
        return ""
    if le.data.image_objects is not None:
        return json.loads(le.data.image_objects.__repr__())
    return ""
 
 
def do(signer, data):
    dip = AIServiceVisionClient(config={}, signer=signer)
 
    body = json.loads(data.getvalue())
    input_parameters = body.get("parameters")
    col = input_parameters.get("column")
    input_data = base64.b64decode(body.get("data")).decode()
    df = pandas.read_json(StringIO(input_data), lines=True)
    df['enr'] = df.apply(lambda row : vision(dip,row[col]), axis = 1)
    #Explode the array of detected objects into one row per object
    dfe = df.explode('enr', ignore_index=True)
    #Add a column for each property we want to return from the imageObjects struct
    ret = pandas.concat([dfe, pandas.DataFrame((d for idx, d in dfe['enr'].items()))], axis=1)


    #Drop the column holding the array of detected objects
    ret = ret.drop(['enr'],axis=1)
    #Drop the input text column; we don't need to return it (there may be other columns)
    ret = ret.drop([col],axis=1)
    if 'name' not in ret.columns:
        return pandas.DataFrame(columns=['id','name','confidence','x0','y0','x1','y1','x2','y2','x3','y3']).to_json(orient='records')
    for i in range(4):
        ret['x' + str(i)] = ret.apply(lambda row: row['bounding_polygon']['normalized_vertices'][i]['x'], axis=1)
        ret['y' + str(i)] = ret.apply(lambda row: row['bounding_polygon']['normalized_vertices'][i]['y'], axis=1)
    ret = ret.drop(['bounding_polygon'],axis=1)
 
    rstr=ret.to_json(orient='records')
    return rstr
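Before deploying, you can exercise the handler logic locally by swapping the resource principals signer for an API-key signer. This is a minimal test harness, assuming the dependencies from requirements.txt are installed locally, a valid ~/.oci/config profile exists, and the placeholder image URL is replaced with a reachable image.

# Sketch: invoke do() from func.py locally with API-key authentication.
import base64
import io
import json

import oci
from func import do  # assumes this file sits next to func.py

config = oci.config.from_file()
signer = oci.signer.Signer(
    tenancy=config["tenancy"],
    user=config["user"],
    fingerprint=config["fingerprint"],
    private_key_file_location=config["key_file"],
)

# Build the same payload shape Data Integration sends (see section 3.4).
record = {"id": 1, "inputText": "https://<server-name>/api/v1/image/sample.jpg"}
payload = {
    "data": base64.b64encode(json.dumps(record).encode("utf-8")).decode("utf-8"),
    "parameters": {"column": "inputText"},
}
print(do(signer, io.BytesIO(json.dumps(payload).encode("utf-8"))))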

3.3 Deploying the Function

Deploy the function to your application.

Complete 3.2 Creating a Function before trying this task.

  1. Run the following cloud shell command:
fn -v deploy --app <app_name>
  2. Confirm that the function is registered in your container registry.
    1. From the Console navigation menu, click Developer Services.
    2. Click Container Registry. The function is visible in the container registry.

3.4 Invoking the Function

Test the function by calling it.

Complete 3.3 Deploying the Function before trying this task.

Oracle Cloud Infrastructure Data Integration supports calling functions, where the data payload is a single Base64-encoded string that contains the records to process and a set of parameters. For example:
{"data":"eyJpZCI6MSwiaW5wdXRUZXh0IjoiaHR0cHM6Ly9pbWFnZS5jbmJjZm0uY29tL2FwaS92MS9pbWFnZS8xMDYxOTYxNzktMTU3MTc2MjczNzc5MnJ0czJycmRlLmpwZyJ9", "parameters":{"column":"inputText"}}
The encoded data is the Base64-encoded version of a set of records in JSON Lines format (each line is a JSON document for one record). Each record has an ID that's used to associate the output with the input. Decoding the example string gives:
{"id":1,"inputText":"https://<server-name>/api/v1/image/106196179-1571762737792rts2rrde.jpg"}
Test the function with the following command:
echo '{"data":"<data-payload>", "parameters":{"column":"inputText"}}' | fn invoke <application-name> object-detection
The output looks similar to:
[{"id":1,"confidence":0.98330873,"name":"Traffic Light","x0":0.0115499255,"y0":0.4916201117,"x1":0.1609538003,"y1":0.4916201117,"x2":0.1609538003,"y2":0.9927374302,"x3":0.0115499255,"y3":0.9927374302},{"id":1,"confidence":0.96953976,"name":"Traffic Light","x0":0.8684798808,"y0":0.1452513966,"x1":1.0,"y1":0.1452513966,"x2":1.0,"y2":0.694972067,"x3":0.8684798808,"y3":0.694972067},{"id":1,"confidence":0.90388376,"name":"Traffic sign","x0":0.4862146051,"y0":0.4122905028,"x1":0.8815201192,"y1":0.4122905028,"x2":0.8815201192,"y2":0.7731843575,"x3":0.4862146051,"y3":0.7731843575},{"id":1,"confidence":0.8278353,"name":"Traffic sign","x0":0.2436661699,"y0":0.5206703911,"x1":0.4225037258,"y1":0.5206703911,"x2":0.4225037258,"y2":0.9184357542,"x3":0.2436661699,"y3":0.9184357542},{"id":1,"confidence":0.73488903,"name":"Window","x0":0.8431445604,"y0":0.730726257,"x1":0.9992548435,"y1":0.730726257,"x2":0.9992548435,"y2":0.9893854749,"x3":0.8431445604,"y3":0.9893854749}]

4. Adding a Functions Policy

Create a policy so that the function can be used with Vision.

Complete 3. Create an Enrichment Function before trying this task.

  1. From the Console navigation menu, click Identity & Security.
  2. Click Dynamic Groups.
  3. Create a dynamic group with the following rule:
    ALL {resource.type = 'fnfunc', resource.compartment.id = '<compartment-id>'}
  4. Add the following statements to the policy:
    allow any-user to use functions-family in compartment <compartment-name> where ALL {request.principal.type= 'ApiGateway', request.resource.compartment.id = '<compartment-id>'}
    allow dynamic-group <dynamic-group-name> to use ai-service-vision-family in tenancy

5. Creating an Oracle Cloud Infrastructure Data Integration Workspace

Before you can use Data Integration, ensure you have the rights to use the capability.

Complete 4. Adding a Functions Policy before trying this task.

Create the policies that let you use Data Integration.

  1. From the Console navigation menu, click Analytics & AI.
  2. Click Data Integration.
  3. Select the compartment for your workspace.
  4. Click Create Workspace.
  5. Give the workspace a Name. Avoid entering confidential information.
  6. Ensure Enable private network is selected.
  7. Select a VCN in the compartment.
  8. Ensure that the subnet you select is your private subnet.
  9. Click Create.
    The workspace takes a few minutes to create.
  10. When the workspace is created, confirm it's in the Active state.

6. Adding Data Integration Policies

Update your policy so you can use Data Integration.

Complete 5. Creating an Oracle Cloud Infrastructure Data Integration Workspace before trying this task.

  1. Follow the steps in Setting Up the Required Policies to open the Policy Builder.
  2. Add the following statements to the policy:
    allow any-user to read buckets in compartment <compartment-name> where ALL {request.principal.type = 'disworkspace', request.principal.id = '<data-integration-workspace-ocid>', request.operation = 'GetBucket'}
    allow any-user to manage objects in compartment <compartment-name> where ALL {request.principal.type = 'disworkspace', request.principal.id = '<data-integration-workspace-ocid>'}
    allow any-user to manage buckets in compartment <compartment-name> where ALL {request.principal.type = 'disworkspace', request.principal.id = '<data-integration-workspace-ocid>', request.permission = 'PAR_MANAGE'}
    allow any-user to {PAR_MANAGE} in compartment <compartment-name> where ALL {request.principal.type='disworkspace', request.principal.id='<data-integration-workspace-ocid>'}
    allow any-user to use functions-family in compartment <compartment-name> where ALL {request.principal.type = 'disworkspace', request.principal.id='<data-integration-workspace-ocid>'}

7. Prepare the Data Sources and Sinks

As sample data, you use images of parked cars along with the date each image was taken.

Gather 10 images (or more) of parked cars as the data source on which you perform object detection analysis using Data Integration and Vision.

7.1 Loading Sample Data

Load the parked car images sample data to your bucket.

Complete 6. Adding Data Integration Policies before trying this task.

  1. Find 10 images of parked cars, either locally or online.
  2. In the Console navigation menu, click Storage.
  3. Click Buckets.
  4. Select an existing bucket, or create a new one.
  5. On the Buckets details page, under Objects, click Upload.
  6. Drag the 10 image files you gathered in step 1 to the drop zone.
  7. Click Upload.
  8. Create a CSV file with a table of four columns and 10 rows. The column names are Record ID, Image Name, Date Taken, and Image Location. Fill in the Record ID column from 1 to 10. (A scripted way to generate this file follows these steps.)
    Figure 2. Sample Data File
    Table of four columns and ten rows. The column names are Record ID, Image Name, Date Taken, and Image Location.
  9. Name the file cars.csv.
  10. Fill in the table by providing the image names, dates taken, and image locations.
    To find an image's location, select the Actions menu (Actions Menu) for the image in the Console when viewing the bucket, select View Object Details, and copy the URL path into cars.csv.
  11. Upload cars.csv to the bucket.
  12. Click Close.
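If you prefer to script the file instead of building it by hand, the following sketch generates cars.csv with pandas. The image names, date, and URL pattern are placeholders; substitute the object URLs you copied from your bucket.

# Sketch: generate cars.csv; names, dates, and URLs are placeholders.
import pandas as pd

rows = [
    {
        "Record ID": i,
        "Image Name": f"car{i}.jpg",   # placeholder file names
        "Date Taken": "2021-06-01",    # placeholder date in yyyy-MM-dd format
        "Image Location": f"https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/car{i}.jpg",
    }
    for i in range(1, 11)
]
pd.DataFrame(rows).to_csv("cars.csv", index=False)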

7.2 Creating a Staging Bucket

Data Integration needs a staging location for intermediate files before it publishes data to a data warehouse.

Complete 7.1 Loading Sample Data before trying this task.

  1. In the Console navigation menu, click Storage.
  2. Click Buckets.
  3. Click Create Bucket.
  4. Give it a suitable Name, for example, data-staging. Avoid entering confidential information.
  5. Accept all other default values.
  6. Click Create.

7.3 Preparing the Target Database

Configure your target Autonomous Data Warehouse database to add a schema and a table.

Complete 7.2 Creating a Staging Bucket before trying this task.

  1. In the Console navigation menu, click Oracle Database.
  2. Click Autonomous Data Warehouse.
  3. Select your Compartment.
  4. Click Create Autonomous Database.
  5. Enter a Display Name. Avoid entering confidential information.
  6. Enter a Database Name. Avoid entering confidential information.
  7. Set Workload type to Data warehouse.
  8. Create the username and password for the database's administrator.
  9. Set Access type to Secure access from anywhere.
  10. Set Authentication to mTLS.
  11. Set License type to BYOL.
  12. Click Create Autonomous Database.
  13. When your database is provisioned, on the Database details page, click Database Actions.
  14. Log in with the credentials you supplied in step 8.
  15. Click Development.
  16. Click SQL.
  17. Create a Contributor user by running the following SQL:
    CREATE USER USER1 IDENTIFIED BY "<enter user1 password here>";
    GRANT DWROLE TO USER1;
    ALTER USER USER1 QUOTA 200M ON DATA;

    Autonomous Databases come with a predefined database role called DWROLE. It provides the common privileges that a database developer or data scientist needs to perform real-time analytics. Depending on your usage requirements, you might also need to grant privileges to other users.
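As a quick check that USER1 can connect before you wire up Data Integration, here is a minimal sketch using the python-oracledb driver; the DSN alias, wallet paths, and passwords are assumptions you replace with your own values.

# Sketch: verify USER1 connectivity to the ADW (mTLS with a wallet).
import oracledb

conn = oracledb.connect(
    user="USER1",
    password="<user1-password>",
    dsn="<database-name>_high",          # a TNS alias from the wallet's tnsnames.ora
    config_dir="/path/to/wallet",        # directory of the unzipped wallet
    wallet_location="/path/to/wallet",
    wallet_password="<wallet-password>",
)
with conn.cursor() as cursor:
    cursor.execute("SELECT user FROM dual")
    print(cursor.fetchone())             # ('USER1',)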

7.4 Creating a Table to Project the Analyzed Data

Create a table to store the information about the detected objects.

Complete 7.3 Preparing the Target Database before trying this task.

  1. Navigate to the Database Actions dashboard if you're not already there.
  2. Run the following script:
    CREATE TABLE USER1.OBJECTS
       ("RECORD_ID" INT,
        "IMAGE_NAME" VARCHAR2(200 BYTE),
        "DATE_TAKEN" DATE,
        "IMAGE_LOCATION" VARCHAR2(2000 BYTE),
        "OBJECT_NAME" VARCHAR2(200 BYTE),
        "OBJECT_CONFIDENCE" DECIMAL(8,7),
        "VERTEX_X1" FLOAT,
        "VERTEX_Y1" FLOAT,
        "VERTEX_X2" FLOAT,
        "VERTEX_Y2" FLOAT,
        "VERTEX_X3" FLOAT,
        "VERTEX_Y3" FLOAT,
        "VERTEX_X4" FLOAT,
        "VERTEX_Y4" FLOAT
    ) SEGMENT CREATION IMMEDIATE
      PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255
     NOCOMPRESS LOGGING
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1
      BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT)
      TABLESPACE "VISION"

8. Use a Data Flow in Data Integration

Create the components necessary for a data flow in Data Integration.

The data flow is:
Figure 3. The Data Flow
The flow starts with the cars.csv file; an expression and an object detection function work on it, and the output is sent to the target data warehouse.

All the underlying storage resources were created in earlier sections. In Data Integration, you create the data assets for each of the elements of the data flow.

8.1 Creating a Data Asset for your Source and Staging

Create a data asset for your source and staging data.

Complete 7. Prepare the Data Sources and Sinks before trying this task.

  1. From the Console navigation menu, click Analytics & AI.
  2. Click Data Integration.
  3. On the home page of the workspace you created in 5. Creating an Oracle Cloud Infrastructure Data Integration Workspace, click Create Data Asset on the Quick Actions tile.
  4. On the Create Data Asset page, populate the following fields:
    1. For Name, enter cars-data-source.
    2. For Description, enter some text which helps you, or other users, find the data asset.
    3. Select Oracle Object Storage as the Type.
    4. Enter the Tenant OCID. To find your tenancy information, click the Profile icon in the Console.
    5. Specify the OCI Region.
  5. Click Test Connection.
  6. Click Create.
  7. (Optional) If the location of your staging bucket is different, repeat steps 3 - 6 to create a data asset for staging.
    Note

    The name of the staging location must be capitalized.

8.2 Creating a Data Asset for your Target

Create a data asset for the target data warehouse.

Complete 8.1 Creating a Data Asset for your Source and Staging before trying this task.

  1. From the Console navigation menu, click Analytics & AI.
  2. Click Data Integration.
  3. On the home page of the workspace you created in 5. Creating an Oracle Cloud Infrastructure Data Integration Workspace, click Create Data Asset on the Quick Actions tile.
  4. On the Create Data Asset page, populate the following fields:
    1. For Name, enter data-warehouse.
    2. Although the Identifier is auto-generated from the Name, you can change the value.
    3. For Description, enter some text which helps you, or other users, find the data asset.
    4. Select Oracle Autonomous Data Warehouse as the Type.
    5. Click Select Database.
    6. Enter the Tenant OCID. To find your tenancy information, click the Profile icon in the Console.
    7. Select the Compartment.
    8. Select the ADW you created in 7.3 Preparing the Target Database.
    9. Specify the TNS Level that's appropriate for your connection.
    10. Select the Service Name to connect to your ADW.
  5. In the Connection section, add the following information:
    1. For Name, keep the default value, or rename it if you want.
    2. For Description, enter some text which helps you, or other users, find the data asset.
    3. Set User Name to USER1.
    4. Enter the Password for the user.
  6. Click Test Connection to verify the credentials that you just entered.
  7. If step 6 is successful, click Create.

8.3 Creating a Data Flow

Create a data flow in Data Integration to ingest the data from the file.

Complete 8.2 Creating a Data Asset for your Target before trying this task.

  1. On the details page of your project (for example, vision-lab), click Data Flows.
  2. Click Create Data Flow.
  3. In the data flow designer, click the Properties panel.
  4. For Name, enter lab-data-flow.
  5. Click Create.

8.4 Adding a Data Source

Now add a data source to your data flow.

Complete 8.3 Creating a Data Flow before trying this task.

After you create the data flow in 8.3 Creating a Data Flow, the designer remains open, and you can add a data source to it using the following steps:

  1. Drag the source icon into the data flow workspace area.
  2. Select the source.
  3. In Properties, select the Details tab.
  4. Update the properties as follows:
    1. In Identifier, enter CARS_CSV.
    2. In Details, select the source data asset you created in 8.1 Creating a Data Asset for your Source and Staging.
    3. Set Connection to the default.
    4. In Schema, select the bucket that contains the cars.csv data file.
    5. In Data entity, select the cars.csv data file.
    6. Set File type to CSV.
  5. Navigate to Data. The data appears there after a minute or two.

8.5 Adding an Expression

Add an expression to change the data type of the ID field to integer, and the data type of the date_taken field to date.

Complete 8.4 Adding a Data Source before trying this task.

  1. Right-click the action menu icon for the <data source name>.id field.
  2. Select Change Data Type.
  3. Enter ID.
  4. For Data Type, select integer.
  5. Click Apply.
    A new expression step is created in your data flow.
  6. Right-click the action menu icon for the <data source name>.date_taken field.
  7. Select Change Data Type.
  8. For Data Type, select Date.
  9. Ensure the date formatting matches what is in the CSV file (yyyy-MM-dd).
  10. Set Name to DATE_TAKEN.
  11. Click the Data tab for the expression to see the new fields.
    Figure 4. Data Fields
    The four data fields for the expression with populated cells.

8.6 Adding a Function

Add a function to the Data Flow to extract objects from the input images.

Complete 8.5 Adding an Expression before trying this task.

  1. From the operators toolbar, drag the Function (fn) operator onto the canvas.
  2. Connect the output of your expression as the input to the function.
  3. Click the function.
  4. In the Properties pane, navigate to Details.
  5. Change the identifier to OBJECT_DETECTION.
  6. Click Select to select a function.
  7. Select the application you created in 3.1 Creating an Application.
  8. Select the object-detection function.
  9. Click OK to confirm your changes.
  10. Add or edit the properties. Click Add Property to add a property. Use the following values:
    Function Attributes
    Name       | Type                   | Data Type                 | Length | Scale | Value
    data       | Input attribute        | VARCHAR                   | 2000   |       |
    column     | Function Configuration | VARCHAR                   |        |       | data
    BATCH_SIZE | Function Configuration | NUMERIC/VARCHAR (default) |        |       | 1
    name       | Output attribute       | VARCHAR                   | 200    |       |
    confidence | Output attribute       | DECIMAL                   | 8      | 7     |
    x0         | Output attribute       | FLOAT                     | 64     |       |
    y0         | Output attribute       | FLOAT                     | 64     |       |
    x1         | Output attribute       | FLOAT                     | 64     |       |
    y1         | Output attribute       | FLOAT                     | 64     |       |
    x2         | Output attribute       | FLOAT                     | 64     |       |
    y2         | Output attribute       | FLOAT                     | 64     |       |
    x3         | Output attribute       | FLOAT                     | 64     |       |
    y3         | Output attribute       | FLOAT                     | 64     |       |

8.7 Mapping the Output to the Data Warehouse Table

Map the output of the object detection function to the Data Warehouse Table.

Complete 8.6 Adding a Function before trying this task.

  1. Navigate to the Map tab.
  2. Drag image_location from the source attributes table onto the data input field of the function.
  3. From the operators toolbar, drag the Target operator onto the canvas.
  4. Connect the output of your object detection function to the input of the target operator.
  5. In the details properties tab for the target, set the following fields to the given values:
    Identifier: TARGET_OBJECT_DETECTION
    Integration Strategy: Insert
    Data Asset: the data warehouse asset you created in 8.2 Creating a Data Asset for your Target
    Connection: Default connection
    Schema: USER1
    Data Entity: OBJECTS
    For Staging Location, provide the object storage location where the intermediate files are created when the data flow runs:
    Data Asset: cars-data-source
    Connection: Default connection
    Schema: the data-staging bucket you created in 7.2 Creating a Staging Bucket
  6. Map the output of the function to the correct fields in the target database table. Use the mappings in the following table:
    Function Output Map
    Name              | Mapping
    RECORD_ID         | RECORD_ID
    IMAGE_NAME        | Image_Name
    DATE_TAKEN        | DATE_TAKEN
    IMAGE_LOCATION    | Image_Location
    OBJECT_NAME       | name
    OBJECT_CONFIDENCE | confidence
    VERTEX_X1         | x0
    VERTEX_Y1         | y0
    VERTEX_X2         | x1
    VERTEX_Y2         | y1
    VERTEX_X3         | x2
    VERTEX_Y3         | y2
    VERTEX_X4         | x3
    VERTEX_Y4         | y3
    The mappings should look like:
    Figure 5. Mappings One to Four
    The first four mappings shown in the application
    Figure 6. Mappings Five to Eight
    The second four mappings shown in the application
    Figure 7. Mappings Nine to Twelve
    The third four mappings shown in the application
    Figure 8. Mappings Thirteen and Fourteen
    The final two mappings shown in the application

8.8 Running the Data Flow

Run the data flow to populate the target database.

Complete 8.7 Mapping the Output to the Data Warehouse Table before trying this task.

  1. In your workspace Quick Actions menu (Actions Menu), click Create Integration Task.

    As part of the creation process, select the project and the data flow you created in 8.3 Creating a Data Flow.

  2. In the workspace, select Applications.
  3. Click Create Application.
  4. Enter a Name.
  5. Click Create.
  6. In the workspace, click Projects.
  7. Select the project you created in 8.3 Creating a Data Flow.
  8. In the Details menu, click Tasks.
  9. For the task you created in step 1, open the action icon menu.
  10. Click Publish to Application.
  11. Select the application you created in step 3.
  12. In the application, select the integration task.
  13. Click Run from the action icon menu.
    You can follow progress of the run from the Runs page. If there are any errors, check the logs to help understand why.
  14. When the run finishes successfully, check the database to confirm that the table is populated correctly. Run the following SQL:
    SELECT * FROM USER1.OBJECTS;

9. Visualize the Data in Analytics Cloud

Display the data you created using Analytics Cloud.

You need access to Analytics Cloud and the ability to create an Analytics Cloud instance.

9.1 Creating an Analytics Cloud Instance

Follow these steps to create an Analytics Cloud instance.

Complete 8. Use a Data Flow in Data Integration before trying this task.

  1. From the Console navigation menu, click Analytics & AI.
  2. Click Analytics Cloud.
  3. Select your Compartment.
  4. Click Create Instance.
  5. Enter a Name. Avoid entering confidential information.
  6. Select 2 OCPUs. Keep the other configuration parameters at their default values.
  7. Click Create.

9.2 Creating a Connection to the Data Warehouse

Follow these steps to set up a connection from your Analytics Cloud instance to your data warehouse.

Complete 9.1 Creating an Analytics Cloud Instance before trying this task.

  1. On the instance details page, click Analytics Home Page, and sign in to your Analytics Cloud instance.
  2. Click Create Dataset.
  3. Click Create Connection.
  4. Select Oracle Autonomous Data Warehouse.
  5. Enter the login credentials for the target database you created in 7.3 Preparing the Target Database.
    If you need the wallet, see Download a Wallet for more information.

9.3 Creating a Dataset

Follow these steps to create a dataset.

Complete 9.2 Creating a Connection to the Data Warehouse before trying this task.

  1. Click Data.
  2. Click Create.
  3. Click Create a New Dataset.
  4. Select your data warehouse.
  5. From the USER1 schema, drag the OBJECTS table onto the canvas.
  6. Save your dataset.

9.4 Creating a Visualization

Follow these steps to display your data in Analytics Cloud.

Complete 9.3 Creating a Dataset before trying this task.

  1. Click Create.
  2. Click Workbook.
  3. Select the dataset you created in 9.3 Creating a Dataset.
  4. Click Add to Workbook.
  5. Click the Visualizations tab.
  6. Drag a bar visualization onto the canvas.
  7. Go to the Data panel.
  8. Right-click My Calculations.
  9. Select Add Calculation....
  10. Set Name to COUNT OF OBJECTS.
  11. Enter COUNT(OBJECT_NAME) in Function.
  12. Drag COUNT OF OBJECTS onto the Value (Y-axis) of the visualization.
  13. Select OBJECTS > DATE_TAKEN for the Category (X-axis) of the visualization.
  14. Click DATE_TAKEN.
  15. Select Show by...Day.
  16. For Color, select OBJECT_NAME.
    You see a graph similar to:
    Figure 9. Bar Graph Visualization
    Bar graph showing the number of cars and wheels detected for each day.