melnass
Frequent Visitor

Retrieving JSON data

Hello,

 

I have a flow that grabs data from an API call, parses it with Parse JSON, and then uploads the data I receive into a database.

 

The JSON data that I am getting has multiple levels, so when I want to get a value I am using this expression:

body('ParseScore')?['Level1']?['Level2']?['Level3']?['Level4']

 

Levels 1-3 are set up as objects, and Level 4 is sometimes a string and other times an integer.

 

Sometimes when I run the flow, Level 3 comes back as a string with "Insuffecient Data", which means there is no Level 4, so when I try to read Level 4 I get this error: The template language expression 'body('ParseScore')?['Level1']?['Level2']?['Level3']?['Level4'] ' cannot be evaluated because property 'Level4' cannot be selected. Property selection is not supported on values of type 'String'. Please see https://aka.ms/logicexpressions for usage details.'.

 

Is there something I can do to detect that Level 4 doesn't exist, like an "if error"? Or do I need to do something like initializing variables with a try/catch, so that when a certain value doesn't exist I can replace it with a string variable?
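
For reference, the kind of guard being asked about here can be written directly in the expression language, assuming if() only evaluates the branch it returns (a sketch using the level names above, with the test simply asking whether Level 3 serialises as an object):

if(
    startsWith(string(coalesce(body('ParseScore')?['Level1']?['Level2']?['Level3'], '')), '{'),
    body('ParseScore')?['Level1']?['Level2']?['Level3']?['Level4'],
    null
)

That avoids the failure on a single value, though with 700 values it still means editing every expression once.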

 

A little more information: the JSON data I am getting has over 700 values that I need to upload to different tables in the database, so checking each value individually would make the flow very long. Is there a way to use Apply to each or something similar?

 

Thank you for your help

19 REPLIES
eliotcole
Super User

Hey, @melnass, is it possible that the runs where it's interpreting Level 3 as "Insuffecient Data" are runs that don't have any data *in* the Level 3 submission?

 

I've found that Flow can be pernickety with empty JSON values sometimes, so I just add in Conditions that check for empty values, and if there are any, don't process that value.

 

How you integrate that into your logic, of course, depends on your flow ... but I tend to play out my logic with separate actions rather than the tidier way of packing as much as possible into expressions, so that others after me can understand the flow even if they're not able (or willing) to read the textual stuff. I say all that to excuse myself when I say that, to handle these empty values, I just create temporary arrays (using Variable actions) that hold the data that needs to be processed at any stage in the flow. This way, if your Level 3 had a *lot* of data in it (meaning a potentially large section of processing in the flow), it would be run when present, or skipped when empty.

 

Another thing (which is, I admit, arguably redundant) is to set boolean variables to track key values like these in the data; if they're empty, then my "level3emptyVAR" would be set to True, meaning I can just check the boolean instead. Again, I know that seems redundant, but in my head it allows for future expansion, should that value need to be checked more than once or changed based on other actions in the flow.

 

I know this reply may seem rambling, but just putting a few thoughts up there is all ... hope that you get it sorted!

@eliotcole The API is returning "Insuffecient data" for Level 3 as a string, instead of returning Level 3 as an object with properties or values below it. So it isn't empty; it is returning a string with the words "Insuffecient data", as if it doesn't have enough data points to make a decision on what to put in Level 4.

 

I thought about putting conditions in to check each value to make sure it is there (it is fine if it is null, I just care that the branch, i.e. Level 4, exists), but that means I would need over 700 conditions to check whether each value I want is available or not. Not all the values have 4 levels; some of them have 3, 2 or even only 1.

 

Some values are more likely to be missing than others, so if there is no easy way to check whether all the values are there, I could probably just check the ones most likely to be missing.

 

My flow isn't really complicated: I receive the JSON, parse it, and then upload it to multiple database tables. I am not sure there is a way to simplify it further.

 

So I am just hoping for an easy way to check the values coming from the JSON, something like "if reading this value gives an error, replace it with null", similar to what other programs like Excel or Power BI offer.

 

Thanks

I think we probably need to see your flow, and the key formulas / expressions that go in there.

 

It's possible that the data that you're receiving in the Level 3 input (however that's coming in) is malformed, but that should provide an error before the one that you're seeing.

 

Have you tried "re-grabbing" the JSON parsing schema? It's entirely possible (I've done this a billion times) that you added a value to the data that's coming in to the flow, and the JSON schema isn't quite right for what it's seeing.

 

Seriously, this has happened to me with the tiniest of bloody elements ... so frustrating. Even a months-old flow suddenly saw new data and was just like "... er ... I ain't touching that, brah." 😣

I probably should have made myself clearer. I call the API with different parameters, and most of the time I get all the values that I need, but there are a few cases where some of the expected values are missing. And I can't know which values I will get back until after I have called the API.

That shouldn't matter too much, @melnass; however, if the API call that you're making is bringing in data that the flow is unequipped to handle then (and I could be wrong here, happily corrected) it'll error.

 

Basically, I think in the JSON schema you can set what's required and what's not ... but I'm no JSON expert. Perhaps that'll help you differentiate between corrupting elements in the pulled data?
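
For what it's worth, a Parse JSON schema can be loosened as well as tightened: the "required" list controls which properties must be present, and a property's "type" can be given as a list so it is allowed to be either an object or a string. A sketch of the idea (generic level names, not the real schema):

{
  "type": "object",
  "properties": {
    "Level3": {
      "type": ["object", "string"]
    }
  },
  "required": []
}

That only keeps the Parse JSON step happy, though; any expression that reads Level 4 underneath it would still need its own guard.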

 

Also, let's get some screenshots of the flow. 🙂

Well, it isn't a matter of something being required or not; I am trying to upload all the values I am getting from the JSON to the database, so I need everything.

 

I am attaching a screenshot of the flow to see if it will help explain more of what I am doing.

 

 

So I am calling the API with the data I get from the Power App, parsing the JSON I get back, and then uploading these values to the database. These values can be several levels deep in the JSON. If something is 4 levels deep, that means levels 1, 2 & 3 should be of type "object" and level 4 can be a "string" or "integer", but in some cases I am getting this error:

 

InvalidTemplate. Unable to process template language expressions in action 'InsertWeather' inputs at line '1' and column '2872': 'The template language expression 'body('ParseScore')?['risks']?['enhanced_hail_params']?['nearest_hail_incident']?['year'] ' cannot be evaluated because property 'year' cannot be selected. Property selection is not supported on values of type 'String'.

 

So in this error, risks, enhanced_hail_params and nearest_hail_incident should all be of type "object" and year should be a string, but nearest_hail_incident is coming in as a string, and there is no year.
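
To make the two cases concrete, the difference looks roughly like this (illustrative fragments only, not the real payload, and the values are made up):

{
  "risks": {
    "enhanced_hail_params": {
      "nearest_hail_incident": {
        "year": "2019"
      }
    }
  }
}

versus

{
  "risks": {
    "enhanced_hail_params": {
      "nearest_hail_incident": "Insuffecient data"
    }
  }
}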

 

Does this explain the situation better?

I am trying to upload a photo, but for some reason it isn't working. Is there something special I need to do to upload the image? I can see the image in the editor, but when I post, or even click Preview, it disappears.

OK, I am trying to find a way to check all the values coming from the JSON before I try to upload them to the database.

I have uploaded all the value names into SharePoint and I am reading them into an array. The way I am getting the values from the JSON is by typing body('ParseScore')?['risks']?['address']. Is there a way to read this string from SharePoint and have Power Automate evaluate it as an expression? If that doesn't work, could I manually type body('ParseScore') and have the rest of the string (?['risks']?['address']) read from SharePoint, with the flow still recognizing it as an expression?

 

 

Screenshots

@melnass are you on a PC or a Mac?

 

Both can take screenshots; I just don't know how it's done on a Mac. On a PC, just press the PrtScn (Print Screen) button on your keyboard, or:

  1. Press Start.
  2. Type "On-Screen Keyboard" (without the quotes, obviously).
  3. Then tap the 'PrtScn' button to put a copy of the screen in your clipboard.
  4. Once you have that, paste it into Paint, or another image program, and save the image as a PNG or a JPG.

Then you can upload it here. Some browsers have issues with this editor; it's not a brilliant one.

 

But once you've managed to get the file, it's just a matter of tapping the camera button in the editor to upload images.

 

----------------------------------

I have to say, I really do think that to understand the issue we would need to see how the data is actually coming in.

 

You don't have to show us literal data, but it's hard to understand what you're doing with it without pictures (or a more detailed explanation of each flow action).

 

As such I may even have completely misinterpreted how the information is flowing in your flow here ... or not.

 

-----

 

A Wild Suggestion - Parallel Branches

Another suggestion I'd like to propose is splitting the values entirely and processing each level in its own parallel branch of the flow. Again, I'm keen to point out that I'm guessing here, so try not to take the advice too literally, and play around with it, as you might find your solution sideways from this. (I'm not looking for points or anything here! 🙂 ...)

 

So, if that JSON that's coming in is keeping each of the "Levels" separately, and each "Level" has more data within it, I'm going to assume that this is an array that contains the levels.

 

I've created an array, here:

[
  {
    "title": "Level1",
    "detail": "Stuff",
    "detail2": "Stuff2",
    "detail3": "Stuffdfsadfgsdgf3",
    "detail4": "Stuff4",
    "detail5": "Stuff5"
  },
  {
    "title": "Level2",
    "detail": "Stuff",
    "detail2": "Stuff2",
    "detail3": "Stuff3",
    "detail4": "Stuff4",
    "detail5": "Stuff5"
  },
  {
    "title": "Level3",
    "detail": "",
    "detail2": "Stuff2",
    "detail3": "",
    "detail4": "Stuff4",
    "detail5": ""
  },
  {
    "title": "Level4",
    "detail": "Stuff",
    "detail2": "Stuff2",
    "detail3": "Stuff3",
    "detail4": "Stuff4",
    "detail5": "Stuff5"
  },
  {
    "title": "Level5",
    "detail": "Stuff",
    "detail2": "Stuff2",
    "detail3": "Stuff3",
    "detail4": "Stuff4",
    "detail5": "Stuff5"
  }
]

Then I run an apply to each on the body of that array, and for each field that is available, I set the value of a variable, so that I can use it easily elsewhere in the flow, and I'll do that all simultaneously.

 

This way if anything is empty, it won't affect the other data coming through.

 

That could potentially get you past the first hurdle of handling the empty data.
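
As a rough sketch of that loop (made-up action and variable names, and only one field shown):

Apply to each              (input: the array above)
    Condition              empty(item()?['detail'])   is equal to   false
        If yes:            Set variable 'detailVAR' to item()?['detail']
        If no:             (nothing – the empty value is skipped)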

split the load.jpg

 

A Field With Integer / String

 

Now you need to address data that is coming through as possibly integer and possibly string. I think that issue is on the other end of this equation, because if the value of the same field is sometimes coming in as an integer, then it suggests to me that the source isn't set up quite right.

 

This will cause parsing errors whatever you do, so my advice is to validate at the other end, as it feels like the data source isn't properly set up. If a field is meant to be an integer field, make it an integer field; if not, make it a string field that sometimes holds numerical data. Either way, format it as such.

 

If you set the source data to always be a string and parse it as a string, you can still handle any values that are purely numerical using @MrFumble 's excellent integer checker, which uses very simple logic, here:

https://powerusers.microsoft.com/t5/Building-Flows/Test-if-string-is-numeric/td-p/264097

 

He uses the "fake condition" (my phrase), too, which is excellent ... It'll run one branch if the int() integer conversion expression fails and one if it succeeds. Then runs an extra step after each branch that runs whatever happens in the increment branch and only if the set variable 2 is successful (this allows the flow to continue).

MrFumble's Integer Checker

 

 

 

 

I have taken the screenshot; it's just that I am not able to upload the image here. Let me try again.

 

Flow Issue.jpg

 

OK, I am trying to upload the image again and hope it will work this time.

 

So as you can see (if the image uploaded), I am getting the JSON from the ScoreByAddress call, then parsing it and uploading it to the database.

 

Flow Issue 2.jpg

Here is the expression I am using to get the values into the database. According to this, "municipal_boundary" should be an "object" that has values below it, and "gid" is either a "string" or an "integer". But sometimes "municipal_boundary" comes in as a "string", which means it doesn't have a level below it, and this is what's causing the error for me.

 

I thought about initializing variables and checking each value I want to see whether it is available in the JSON or not, but since I have over 700 data points, that would make the flow very long, and I am not sure how speed and cost would be affected.

 

So my next approach was to see if there is a way to create a loop or an array, and then check the values I need by building a try/catch-style expression and reading the expression @{body('ParseScore')?['risks']?['municipal_boundary']?['gid']} from a SharePoint list or an Excel sheet; that way I won't have to check each value manually. Then, if a value gives an error, I can assign it a value manually. Finally, I would upload the values to the database from the array and not from the JSON. Hope that makes sense.

 

Let me know if this makes it clearer, and thank you very much for your help.

Have you tried creating a new Parse JSON with a sample of the JSON that's been failing, @melnass ?

 

(sorry if I've missed that you had ... just popped into my head)

@eliotcole I guess I can do that, but I don't see what that would do for me.

 

I would still not be able to upload the values that I need to the database.

 

What I still need to figure out is a way to see whether "gid" from my earlier example is available or not, and I am not sure which other values this could happen to among the ones I need to upload to the database. So I would prefer an easy way to check all the values I need to upload without having to go through them one by one in the flow.

 

Thanks

A fresh (completely new) JSON parse might just completely wipe out the error.

 

I can't tell you how often I've had to dance around some of the silly things that flow does and hangs on to. 😉

The current JSON schema I am using is the best one for most cases. I'm not sure if you can tell from the photos I attached before, but I am running the flow against different addresses, and some of the addresses are missing information. So the JSON schema I am using didn't change; it is just that some addresses don't have all the information I am requesting.

Hey, @melnass , I'm an idiot, and completely didn't see that image (might've been a DNS/script thing, I'm a paranoid boy and block a lot) ...

 

I notice that there's no Apply to each going on here. I'm not suggesting there should be, but I'd imagine it might come into it.

 

What is this connector? I can't find an analogue on any of my flow accounts:

melnass image - what is it.jpg

 

Would I be right in assuming that with regards to this conversation so far:

  • LEVEL 1 = Address
  • LEVEL 2 = City
  • LEVEL 3 = State
  • LEVEL 4 = ZIP

It still looks like, for each of those values, you could run a separate, parallel branch of the flow to process the data in it.

 

The following advice is purely my own opinion, but I'd try a few simple practices:

  1. I would initialize variables for literally everything that I want to set later, right at the start of your flow, then set the values with the information that comes in.
  2. I would run an "if null" or "length = 0" Condition action on each of those levels before doing anything with them (see the sketch just after this list). Working out 'if null' / 'length = 0' isn't *always* simple, but it ensures that no pointless processing happens.
  3. If one of those parallel branches fails occasionally, you can always place a "Run after" setting on it. More in a second.
  4. Once you've run the separate parallel branches for the variables you need to create, make the next action something that sets any variables built from a combination of the information created in those branches.
  5. Fill in or action whatever else you need to, and end the flow!
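
For point 2, the checks that tend to cover "if null" / "length = 0" are along these lines (sketches only, with a made-up variable name):

empty(variables('level3VAR'))            – true when the value is null, an empty string or an empty collection
equals(variables('level3VAR'), null)     – true only when the value is null itself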

 

I'll update this with more information, images, and more (a basic flow that does the above), later to help make it all make a little more sense, but it's the big game, so I gotta go!

 

Best

E

Thank you for taking the time to keep looking into this.

 

As for the connector in the image, that is a custom API connector I built. It passes these variables (address, city, state and zip) and receives the JSON back, and then I parse it to start uploading the data to the database.

 

As for the levels that I am talking about, I will reuse the example that I put before for the call @{body('ParseScore')?['risks']?['municipal_boundary']?['gid']}

Level 1: risks

Level 2: municipal boundary

Level 3: gid

 

This specific value only has 3 levels. I need to upload the "gid" data to the database; "risks" and "municipal_boundary" are of type "object" and don't contain data themselves. For some addresses, "municipal_boundary" comes in as a string, which means it doesn't have the "gid" branch for me to read and upload to the database. That is when I get an error.

 

Another path is ?['risks']?['municipal_boundary']?['type'], a third one is ?['enhanced_property']?['listing_record']?['status'], and so on.

 

I thought about adding an Apply to each to cycle through all the values I want to upload to the database, but to do that I need a way to read this string, or others like it, such as "?['risks']?['municipal_boundary']?['gid']", from a source like SharePoint (there are more than 700 of these strings, each one different) and append it to the body('ParseScore') expression. If there is no way to do that, then I would need to initialize more than 700 variables, try to set their values with these strings, catch the ones that error, and set those manually. But that would make the flow very long and harder to maintain.
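
One workaround that gets close to this without evaluating a whole expression string: store each path segment in its own column of the SharePoint list (hypothetical columns Level1, Level2, Level3 below, left blank for shorter paths) and use them as dynamic property names inside the Apply to each, since the ?[...] selector accepts a runtime value. A sketch, assuming the loop is named 'Apply_to_each':

body('ParseScore')?[items('Apply_to_each')?['Level1']]?[items('Apply_to_each')?['Level2']]?[items('Apply_to_each')?['Level3']]

The string-instead-of-object cases, and rows with fewer than three levels, would still need a guard around this (for example the string()/startsWith() check sketched earlier in the thread), but it keeps the 700 paths as data rather than 700 hand-written expressions.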

 

 

I'll admit that without having access to the thing itself, it's looking a bit hard to parse from descriptions alone.

 

With regard to your large data pulls, see if there's a way to restrict what you pull from the large lists using OData filters, maybe combined with OData sorting and a limit on how many results you pull. That can make the query relatively painless in terms of avoiding pointless data.

I am happy to set up a meeting with you so I can share my screen.

 

But let me ask a question: is there a way to evaluate an expression from a string?

 

Thanks

You can do anything from a string, @melnass; you just need to get the string somewhere you can process it. My personal recommendation is to put the string into a variable, which allows it to be used and managed in multiple parts of the flow.

 

Make a variable called:

melnassString

Then just process it however you'd process anything else. 👍

 

For what it's worth, you're already doing loads more stuff than I'm able to here. So it's more than probable that it'll be a case of some of my inane rambling jogging something with you!
