jdarling20
Helper I

Regex and MatchAll to parse JSON without Flow

I am trying to parse a JSON string whose objects contain up to 14 columns each, though some objects may contain fewer because blank or null columns are omitted. Here is a section of my JSON (shown with IndentFour for readability only; it is normally set to Compact):

 

[
    {
        "Title": "05/13/2022 00:43:53",
        "actionDate": "2021-12-03",
        "bandwidth_delta": 215,
        "floor_exit_date": "2021-12-31",
        "locationId": 108,
        "max_seats_actual": 275,
        "max_seats_leveraged": 275,
        "projectId": 2,
        "running_container_w_hc": 60,
        "sum_container_hc": 166,
        "sum_container_w_hc": -166,
        "unique_key": "20211203_108"
    },
    {
        "Title": "05/13/2022 00:43:53",
        "actionDate": "2022-01-28",
        "bandwidth_delta": 19,
        "floor_exit_date": "2024-02-28",
        "locationId": 134,
        "max_seats_actual": 68,
        "max_seats_leveraged": 68,
        "projectId": 2,
        "running_container_w_hc": 49,
        "sum_container_hc": 1,
        "sum_container_w_hc": -1,
        "unique_key": "20220128_134"
    },
    {
        "Title": "05/13/2022 00:43:53",
        "actionDate": "2021-11-01",
        "bandwidth_delta": 194,
        "floor_exit_date": "2021-12-31",
        "locationId": 107,
        "max_seats_actual": 202,
        "max_seats_leveraged": 202,
        "projectId": 2,
        "running_container_w_hc": 8,
        "sum_container_hc": 176,
        "sum_container_w_hc": -176,
        "unique_key": "20211101_107"
    }
]

 

Here is the code that I use to get this JSON into my collection:

 

ClearCollect(
    colActionItemAnalysis,
    MatchAll(
        Substitute(
            gblActionItemAnalysisJSON,
            Char(10),
            ""
        ),
        "\{(\s*""Title"":\s*""(?<Title>[^""]*)"",)?(\s*""actionDate"":\s*""(?<actionDate>[^""]*)"",)?(\s*""bandwidth_delta"":\s*(?<bandwidth_delta>\d*),)?(\s*""floor_exit_date"":\s*""(?<floor_exit_date>[^""]*)"",)?(\s*""locationId"":\s*(?<locationId>\d*),)?(\s*""max_people_served"":\s*(?<max_people_served>\d*),)?(\s*""max_seats_actual"":\s*(?<max_seats_actual>\d*),)?(\s*""max_seats_leveraged"":\s*(?<max_seats_leveraged>\d*),)?(\s*""planned_bandwidth"":\s*(?<planned_bandwidth>\d*),)?(\s*""projectId"":\s*(?<projectId>\d*),)?(\s*""running_container_w_hc"":\s*(?<running_container_w_hc>\d*),)?(\s*""sum_container_hc"":\s*(?<sum_container_hc>\d*),)?(\s*""sum_container_w_hc"":\s*(?<sum_container_w_hc>[^""]*),)?\s*""unique_key"":""\s*(?<unique_key>[^""]*)*"
    )
);

 

The issue I'm having is that not all of my items are coming through. For example, if I have 10 items in my JSON array, only 8 of them will come into my collection. It's not a consistent percentage, so I would guess that my issue lies in my regex not covering all conditions. You will notice that I've defined 14 columns for my collection (in the regex), but my JSON sample only shows 12 columns. Due to the way the JSON function works in Power Apps, null or Blank columns are not emitted in the JSON string. I need to be able to populate up to all 14 columns, and I need to get all rows every time.
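The failure mode above can be reproduced with a much smaller pattern. This is a hypothetical two-field reduction of the 14-column regex, not the original: the field names `delta` and `key` are illustrative, and Python's `(?P<name>)` named-group syntax stands in for the `(?<name>)` syntax Power Apps uses. The point it demonstrates: when an optional group fails to consume its field (here, `\d*` cannot match a negative number such as the `-166` in the sample data), nothing else in the pattern consumes that text, so the required trailing field never lines up and the entire object silently fails to match:

```python
import re

# Simplified stand-in for the failing pattern: the numeric field is
# matched with \d*, which cannot consume a leading minus sign.
old = r'\{("Title":"(?P<Title>[^"]*)",)?("delta":(?P<delta>\d*),)?"key":"(?P<key>[^"]*)"\}'

ok = '{"Title":"a","delta":5,"key":"k1"}'
bad = '{"Title":"b","delta":-5,"key":"k2"}'  # negative value: \d* cannot match "-5"

print(re.search(old, ok) is not None)   # True  - positive delta matches
print(re.search(old, bad) is None)      # True  - whole object fails, row is dropped
```

Because every middle group is optional, the regex engine does not error; it simply skips the group, then fails on the required `"key"` literal, and the row vanishes from the results.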

 

My ultimate goal here is to store this content only as a JSON string in SharePoint, read that JSON into a collection in Power Apps, update the collection as needed, and then write the updated JSON string back to SharePoint. Currently I am storing all of these values in SharePoint rows, but that is far too inefficient and slow for my needs.

 

This is my first time using RegEx, so I'm looking for any possible help here.

1 ACCEPTED SOLUTION

jdarling20
Helper I

I was able to resolve this issue using the regex below:

 

"\{""Title"":""(?<Title>[^""]*)"",""actionDate"":""(?<actionDate>[^""]*)"",""bandwidth_delta"":\s*(?<bandwidth_delta>[^""]*),(?:""floor_exit_date"":\s*""(?<floor_exit_date>[^""]*)"",)?(?:""locationId"":\s*(?<locationId>[^""]*),)?(?:""max_people_served"":\s*(?<max_people_served>[^""]*),)?(?:""max_seats_actual"":\s*(?<max_seats_actual>[^""]*),)?(?:""max_seats_leveraged"":\s*(?<max_seats_leveraged>[^""]*),)?(?:""planned_bandwidth"":\s*(?<planned_bandwidth>[^""]*),)?(?:""projectId"":\s*(?<projectId>[^""]*),)?(?:""running_container_w_hc"":\s*(?<running_container_w_hc>[^""]*),)?(?:""sum_container_hc"":\s*(?<sum_container_hc>[^""]*),)?(?:""sum_container_w_hc"":\s*(?<sum_container_w_hc>[^""]*),)?""unique_key"":\s*""(?<unique_key>[^""]*)"""

 

It appears that some of my patterns matching with \d were not applying to all rows (the sample data includes negative values such as -166, which \d* cannot match), thus causing those matches to fail. I have been able to use this expression to match all 956 objects in my JSON, all of which include the final (required) unique_key column.
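The same two-field reduction used to reproduce the bug (hypothetical `delta`/`key` names, Python `(?P<name>)` syntax in place of the `(?<name>)` syntax Power Apps uses) shows why the fix works: swapping `\d*` for `[^"]*` lets the numeric group consume a minus sign, so the row matches and the value is captured:

```python
import re

# Sketch of the accepted fix: the numeric field is matched with [^"]*
# instead of \d*, so negative values like -166 no longer break the match.
new = r'\{("Title":"(?P<Title>[^"]*)",)?("delta":(?P<delta>[^"]*),)?"key":"(?P<key>[^"]*)"\}'

row = '{"Title":"b","delta":-5,"key":"k2"}'
m = re.search(new, row)
print(m.group("delta"))  # -5
```

The trade-off is that `[^"]*` is looser than a true numeric pattern; it relies on the trailing comma and the fixed field order of the compact JSON to stop in the right place, which holds for output generated by the Power Apps JSON function.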


