Applications Of Artificial Intelligence - Tensorflow | Pandas


Tensorflow & Pandas



 We're going to start with a cool, useful tool. We import matplotlib.pyplot as plt, a standard import, and the next line, %matplotlib inline, is really important: it tells matplotlib to render the graph right inside the Jupyter notebook. Remember, if you're running this in a different IDE you might get different results; this is the line you need if you want to see the graph on this page. Then we call diabetes['Age'].hist(bins=20). What is that? Because diabetes is a pandas DataFrame (there's our import pandas as pd), pandas automatically knows to use matplotlib, and hist just means it's going to draw a histogram, splitting the ages into 20 bins. That's the graph you see here. When I take the diabetes data and plot a histogram of it, it produces a really nice picture: around age 22, most of the participants, probably about 174 of the people recorded, fall into that first age bracket, and the counts get lower and lower as the ages go up toward 80.
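For reference, here is a minimal sketch of that histogram cell, assuming the data has already been loaded with pandas into a DataFrame named diabetes with an 'Age' column (the exact variable and column names in the original notebook may differ):

```python
# Minimal sketch of the histogram cell; assumes `diabetes` was already
# loaded with pandas and has an 'Age' column.
import matplotlib.pyplot as plt

# Jupyter-only magic: render plots inline on the notebook page
%matplotlib inline

# pandas wraps matplotlib, so calling .hist() on the Series draws the plot
diabetes['Age'].hist(bins=20)
plt.show()
```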





 You can actually see down here that there are almost none in a couple of the older categories, and this is important to know: when you're doing a census, the older the group is, the more of those people have passed away, so in any census you take you're always going to have a much larger group in the younger crowd, with the counts getting lower and lower as they get older. I also mentioned the word buckets. We're going to create an age bucket; we're literally going to put people in buckets, which is kind of a funny way of looking at it, but it just makes it easier for us to separate the data. When I go into the doctor's office I don't want to be told, well, you're 22, here's what applies exactly at age 22; an age range is more useful.

Here we are back at my cup of coffee (I told you coffee was an underlying theme). These next couple of steps are very specific to TensorFlow. Up until now we had some TensorFlow as we set the feature columns up, but you'd have to do that with any model you use to make sure they're set up correctly, just a little differently. Now we run input functions, and in TensorFlow this allows some really cool things to happen, which is part of why TensorFlow is so predominant in so many areas. So let's take a look at the next couple of lines of code: we're going to create the input function and then create the model. Let me paste them in, split across separate lines so we can talk about them, and run it. In my particular environment it just prints out a little bit of information about the model; we're not too worried about reading that, but we do want to take a closer look at what we did here. We've created an input function, and again, this is very specific to TensorFlow.
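As a rough illustration of the age-bucket idea, here is how a bucketized age column might be set up with the TensorFlow 1.x feature-column API. The column name 'Age', the boundary values, and the feat_cols list are assumptions for this sketch, not necessarily the exact values used in the original notebook:

```python
import tensorflow as tf  # TensorFlow 1.x-style feature-column / estimator API

# Raw numeric column for age (the column name 'Age' is an assumption here)
age = tf.feature_column.numeric_column('Age')

# Bucketized column: groups ages into ranges ("buckets") rather than
# treating every individual age separately. Boundaries are illustrative.
age_buckets = tf.feature_column.bucketized_column(
    age, boundaries=[20, 30, 40, 50, 60, 70, 80])

# The full feature list would also include the other columns defined earlier
feat_cols = [age_buckets]  # plus the remaining feature columns
```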




 With the input function we have x equal to X_train and y equal to y_train, because we want to train the model on that particular data, but we also have a couple of other settings in here. The number of epochs is how many times it will go over our training data; an epoch means one full pass over all of the data. So we're going over it a thousand times, which is actually huge overkill for this amount of data; it usually only needs about 200, but when you're putting it together and trying things out you just throw a number in and go back and fine-tune it later. The batch size is really important, and this is where TensorFlow does some really cool things: if you're processing a huge amount of data and you try to batch everything at once, you end up with a problem. A batch size of 10 means we only read 10 rows at a time, put those through the model, and train on them. Shuffle is self-explanatory: we randomly select which rows we feed in and in what order, so that if there happen to be, say, five rows in a row weighted one way, it mixes them up.

Finally we create our model. The model is a tf.estimator.LinearClassifier; we pass in feature_columns equal to the columns we defined up above, and n_classes equal to 2, which just means the result is 0 or 1: you have diabetes or you don't, or in this case, you are at high risk of diabetes or you're not. Then there's one more line of code which I almost forgot: we've set up our model and our feature columns, so now we need to actually train it. model.train shows up in so many different neural network models that it's practically standard. What's different here is that we have to feed it the input function we just created, with all that information in it, and then we have steps. Steps is similar to the number of batches: with a batch size of 10, each step processes one batch of rows. A thousand is a lot more common for steps than for epochs, but since we already set the number of epochs, you'd probably leave steps out in this particular example. Let's go ahead and run this all together, because once training starts we get a different output. Here we go.
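Putting those pieces together, a sketch of the training cell might look like the following, assuming a TensorFlow 1.x-style environment and that X_train, y_train, and the feat_cols list from the earlier cells already exist (the names are assumptions):

```python
import tensorflow as tf

# Input function: wraps the training DataFrames and controls batching,
# epochs, and shuffling (X_train, y_train, feat_cols assumed from earlier).
input_func = tf.estimator.inputs.pandas_input_fn(
    x=X_train,         # training features
    y=y_train,         # training labels: 1 = high risk, 0 = not
    batch_size=10,     # read 10 rows at a time
    num_epochs=1000,   # full passes over the training data (overkill here)
    shuffle=True)      # randomize the row order

# Linear classifier with two output classes (high risk / not high risk)
model = tf.estimator.LinearClassifier(
    feature_columns=feat_cols,
    n_classes=2)

# Train: feed it the input function and step through 1000 batches
model.train(input_fn=input_func, steps=1000)
```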

Features Of Pandas


I've run it, it's given me the information from when I created the model, and now it just starts going through. We see output like the loss and the global step, all kinds of things happening. If you're following along, it's simply stepping through the training and reporting information; we could dig deep into it, but for this particular setup we're not going to go too far into what's going on. Just know that we've trained our model, and the model now has the information it needs to start running predictions. So as we take our next sip of coffee, or maybe tea, or if you're one of those late-night workers maybe a sip of wine, we move on.

We go into the next step, where we actually want to run some predictions. We don't want to run the training again; we want to run the test data through and see how it does. What we're going to do next is run the test set through the model and actually get some answers. If you were deploying this for real, you would pull the answers out of the data it brings back. Let's take a look at that in our Jupyter notebook.

Here we go: let's paste it in, and I'll run it. You'll see that as it runs it's actually printing the answers out. We'll walk through this in just a second, but it goes through each line, gives a prediction for each one, and prints them out one at a time. Let's start with the first part: it's another input function, just like the other one, except x equals X_test and there is no y. Why is there no y? Because we don't want to tell the model the answer; we want it to guess the answer so we can evaluate how well it did on that 33% of the data we held out. So this is our X_test, and the batch size is 10 again.
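A sketch of that prediction cell, under the same assumptions (TensorFlow 1.x estimator API, with X_test and the trained model already in scope), might look like this:

```python
# Prediction input function: test features only, no labels, no shuffling,
# one pass through the data, 10 rows per batch.
pred_input_func = tf.estimator.inputs.pandas_input_fn(
    x=X_test,
    batch_size=10,
    num_epochs=1,
    shuffle=False)

# model.predict returns a generator yielding one dictionary per test row
predictions = list(model.predict(pred_input_func))
```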


If we were watching this scroll by, we would see that it processes 10 rows at a time, and it only goes through the X_test data once; we don't need it to predict multiple times. And shuffle equals False is very important: if we were tracking this and actually giving people answers, we'd want to make sure each prediction connects to the right person so they get the right result. Of course, here we're just doing this to evaluate the model. Let's take a look at what it put out. As I scroll down in my Jupyter notebook, there's some information from TensorFlow running and then the actual output. Looking at the output, we know by the first square bracket that in Python it's a list, and we know by the curly bracket that each element is a dictionary. The dictionary has keys like logistic, probabilities, class_ids, and classes, so this whole first chunk is one output: there's the opening bracket, and there's the curly bracket enclosing the dictionary of terms. If I pulled out object 0 and looked inside, I would find classes. Remember how we defined classes? Classes is our answer, and that's what the model is guessing.
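To pull that guess out of one of the dictionaries, something along these lines would work, assuming the predictions list from the sketch above:

```python
# Each prediction is a dictionary; 'classes' holds the predicted label
# as a byte string, e.g. b'1' for high risk, b'0' for low risk.
first = predictions[0]
print(first.keys())      # includes 'logistic', 'probabilities', 'class_ids', 'classes'
print(first['classes'])  # e.g. [b'1'] -> predicted high risk of diabetes
```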


Concept Of Tensorflow


  • TensorFlow says, based on all the information you gave me, I'm guessing this first entry of our test data is high risk for diabetes: go get tested, change your diet, watch what you're eating. High risk is a 1, and if we go down far enough we can see another entry where classes equals zero.
  • I skipped a couple because they're all ones up here. The b in front of the 1 in this output means it's a Python bytes value; the label itself only comes out as 0 or 1. There's a lot of other information in here, and you can dig deep into TensorFlow to explain these different numbers, but that's beyond the scope of what we're working on today. The basics: you get an output of whether the person has diabetes or not, or in this case is at high risk of diabetes or not. Now that we've run our predictions, take a sip of coffee and a short break, and ask: what do we need to do next? We need to know.
  • How good were our predictions? We need to evaluate our model. If you're going to publish this to a company or something like that, they want to know how well you did. Let's take a look at what that looks like in the code. We paste it in and quickly go back over it: by now this function should look very familiar. It's an evaluation version, so we'll call it the eval input function, and it's pretty straightforward.
  • We have tf.estimator.inputs.pandas_input_fn again, and this time we pass both X_test and y_test: we want to give it the answers so it has something to compare against to see how well it did. Batch size is 10, so it processes 10 rows at a time, it only goes once through the data, and we're not shuffling, although that doesn't really matter here. Then we set results equal to model.evaluate with our evaluation function in it. This should all look familiar; it repeats the same pattern with a couple of changes in what we're feeding it and the fact that we're doing an evaluation (see the sketch after this list). Let's see what that looks like.
  • We run it, and when we scroll back up to where the model ran, you'll see warnings on some of these because I didn't install this on its own; a lot of it is about temporary files because I'm using the Jupyter notebook instead of setting it up on a regular machine. Then we get our output. Looking at it real quick before we flip over, it generates an accuracy, an accuracy baseline, an average loss, a precision, a prediction mean, and all kinds of other information.
  • So let's flip over and see what's important on the slide. This should be exciting, because we've just about wrapped up our whole TensorFlow walkthrough, and TensorFlow is one of the more complicated frameworks out there, so give yourself a pat on the back for getting all the way through it. When we look at this output, we have an accuracy of 0.7165, and that's really what we want to look at. It means we have an accuracy, if you're just truncating it, of 71%, which is quite good for our model: given a small amount of data, we came up with a 71 percent rate of letting people know
  • whether they're at high risk of diabetes or not. So we created a model that can predict whether a person has diabetes based on previous records of people who were diagnosed with diabetes, and we managed an accuracy of 71%, which is quite good. The model was implemented in Python using TensorFlow. Again, pat yourself on the back, because TensorFlow is one of the more complicated frameworks out there; it's also one of the more diverse and useful ones. The key takeaways today: we covered what artificial intelligence is,
  • with a robot that brings us coffee, and we noted that we judge it by how it reacts and how human-like it appears, which is very important to note in today's world. We looked at the types of artificial intelligence, from reactive machines to limited memory, and looked into the future at theory of mind and self-awareness.
  • We took a look at recognizing photos and how machine learning works, we glanced at deep learning with neural networks and how we have an input layer, hidden layers, and an output layer, we looked at a use case of somebody walking into a room and activating their TV using artificial intelligence, and finally we dug in deep and did some coding in TensorFlow with Python. With that, let's wrap it up.
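Circling back to the evaluation step mentioned in the list above, here is a sketch of that cell, again assuming the TensorFlow 1.x estimator API and the X_test, y_test, and model names from the earlier sketches:

```python
# Evaluation input function: this time we pass the true labels (y_test)
# so the model has something to compare its guesses against.
eval_input_func = tf.estimator.inputs.pandas_input_fn(
    x=X_test,
    y=y_test,
    batch_size=10,
    num_epochs=1,
    shuffle=False)

# evaluate() returns a dictionary of metrics: accuracy, average loss, etc.
results = model.evaluate(eval_input_func)
print(results['accuracy'])  # roughly 0.7165 in the run described above
```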
