Building a simple Keras + deep learning REST API

    This is a guest post by Adrian Rosebrock. Adrian is the author of PyImageSearch.com, a blog about computer vision and deep learning. Adrian recently finished authoring Deep Learning for Computer Vision with Python, a new book on deep learning for computer vision and image recognition using Keras.


    In this tutorial, we will present a simple method to take a Keras model and deploy it as a REST API.

    The examples covered in this post will serve as a template/starting point for building your own deep learning APIs — you will be able to extend the code and customize it based on how scalable and robust your API endpoint needs to be.

    Specifically, we will learn:

    • How to (and how not to) load a Keras model into memory so it can be used for inference
    • How to use the Flask web framework to create an API endpoint for our model
    • How to make predictions with our model and return the results to the client as JSON
    • How to call our Keras REST API using both cURL and Python

    By the end of this tutorial you'll have a good understanding of the components (in their simplest form) that go into creating a Keras REST API.

    Feel free to use the code presented in this guide as a starting point for your own deep learning REST API.

    Note: The method covered here is intended to be instructional. It is not meant to be production-level and capable of scaling under heavy load. If you're interested in a more advanced Keras REST API that leverages message queues and batching, please refer to this tutorial.


    Configuring your development environment

    We'll be making the assumption that Keras is already configured and installed on your machine. If not, please ensure you install Keras using the official install instructions.

    From there, we'll need to install Flask (and its associated dependencies), a Python web framework, so we can build our API endpoint. We'll also need requests so we can consume our API as well.

    The relevant pip install commands are listed below:

    $ pip install flask gevent requests pillow
    

    Building your Keras REST API

    Our Keras REST API is self-contained in a single file named run_keras_server.py. We kept everything in a single file for the sake of simplicity — the implementation can easily be modularized as well.

    Inside run_keras_server.py you'll find three functions, namely:

    • load_model: Used to load our trained Keras model and prepare it for inference
    • prepare_image: Preprocesses an input image prior to passing it through our network for prediction
    • predict: The actual endpoint of our API that classifies the incoming data from the request and returns the results to the client

    The full code to this tutorial can be found here.

    # import the necessary packages
    from keras.applications import ResNet50
    from keras.preprocessing.image import img_to_array
    from keras.applications import imagenet_utils
    from PIL import Image
    import numpy as np
    import flask
    import io
    
    # initialize our Flask application and the Keras model
    app = flask.Flask(__name__)
    model = None
    

    Our first code snippet handles importing our required packages and initializing both the Flask application and our model.

    From there we define the load_model function:

    def load_model():
        # load the pre-trained Keras model (here we are using a model
        # pre-trained on ImageNet and provided by Keras, but you can
        # substitute in your own networks just as easily)
        global model
        model = ResNet50(weights="imagenet")
    

    As the name suggests, this method is responsible for instantiating our architecture and loading our weights from disk.

    For the sake of simplicity, we'll be utilizing the ResNet50 architecture which has been pre-trained on the ImageNet dataset.

    If you're using your own custom model you'll want to modify this function to load your architecture + weights from disk.

    Before we can perform prediction on any data coming from our client we first need to prepare and preprocess the data:

    def prepare_image(image, target):
        # if the image mode is not RGB, convert it
        if image.mode != "RGB":
            image = image.convert("RGB")
    
        # resize the input image and preprocess it
        image = image.resize(target)
        image = img_to_array(image)
        image = np.expand_dims(image, axis=0)
        image = imagenet_utils.preprocess_input(image)
    
        # return the processed image
        return image
    

    This function:

    • Accepts an input image
    • Converts the mode to RGB (if necessary)
    • Resizes it to 224x224 pixels (the input spatial dimensions for ResNet)
    • Preprocesses the array via mean subtraction and scaling

    Again, you should modify this function based on any preprocessing, scaling, and/or normalization you need prior to passing the input data through the model.

    We are now ready to define the predict function — this method processes any requests to the /predict endpoint:

    @app.route("/predict", methods=["POST"])
    def predict():
        # initialize the data dictionary that will be returned from the
        # view
        data = {"success": False}
    
        # ensure an image was properly uploaded to our endpoint
        if flask.request.method == "POST":
            if flask.request.files.get("image"):
                # read the image in PIL format
                image = flask.request.files["image"].read()
                image = Image.open(io.BytesIO(image))
    
                # preprocess the image and prepare it for classification
                image = prepare_image(image, target=(224, 224))
    
                # classify the input image and then initialize the list
                # of predictions to return to the client
                preds = model.predict(image)
                results = imagenet_utils.decode_predictions(preds)
                data["predictions"] = []
    
                # loop over the results and add them to the list of
                # returned predictions
                for (imagenetID, label, prob) in results[0]:
                    r = {"label": label, "probability": float(prob)}
                    data["predictions"].append(r)
    
                # indicate that the request was a success
                data["success"] = True
    
        # return the data dictionary as a JSON response
        return flask.jsonify(data)
    

    The data dictionary is used to store any data that we want to return to the client. Right now this includes a boolean used to indicate if prediction was successful or not — we'll also use this dictionary to store the results of any predictions we make on the incoming data.

    To accept the incoming data we check if:

    • The request method is POST (enabling us to send arbitrary data to the endpoint, including images, JSON, encoded data, etc.)
    • An image has been passed into the files attribute during the POST

    We then take the incoming data and:

    • Read it in PIL format
    • Preprocess it
    • Pass it through our network
    • Loop over the results and add them individually to the data["predictions"] list
    • Return the response to the client in JSON format

    If you're working with non-image data you should remove the request.files code and either parse the raw input data yourself or utilize request.get_json() to automatically parse the input data to a Python dictionary/object. Additionally, consider giving the following tutorial a read, which discusses the fundamentals of Flask's request object.
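    As a minimal sketch of that alternative, the endpoint below accepts a JSON body via request.get_json() instead of an uploaded file. The "values" key and the echo-style response are hypothetical placeholders, not part of the original script; substitute whatever fields your own model expects.

```python
import flask

app = flask.Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # parse the request body as JSON rather than multipart form data;
    # silent=True returns None on malformed input instead of raising
    data = flask.request.get_json(silent=True)

    # "values" is a hypothetical field name -- substitute whatever your
    # model actually expects (e.g., a list of feature values)
    if data is None or "values" not in data:
        return flask.jsonify({"success": False}), 400

    # a real endpoint would preprocess data["values"] and run inference
    # here; we simply echo back how many values were received
    return flask.jsonify({"success": True, "received": len(data["values"])})
```

    A real implementation would validate the shape and type of the incoming values before handing them to the model.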

    All that's left to do now is launch our service:

    # if this is the main thread of execution first load the model and
    # then start the server
    if __name__ == "__main__":
        print(("* Loading Keras model and Flask starting server..."
            "please wait until server has fully started"))
        load_model()
        app.run()
    

    First we call load_model which loads our Keras model from disk.

    The call to load_model is a blocking operation and prevents the web service from starting until the model is fully loaded. Had we not ensured the model is fully loaded into memory and ready for inference prior to starting the web service we could run into a situation where:

    1. A request is POST'ed to the server.
    2. The server accepts the request, preprocesses the data, and then attempts to pass it into the model
    3. ...but since the model isn't fully loaded yet, our script will error out!

    When building your own Keras REST APIs, ensure logic is inserted to guarantee your model is loaded and ready for inference prior to accepting requests.
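    One way to make that guarantee explicit is to add a guard inside the endpoint itself. The sketch below (a hypothetical addition, not part of the original script) returns an HTTP 503 instead of erroring out if a request arrives before load_model has run:

```python
import flask

app = flask.Flask(__name__)
model = None  # populated by load_model() before app.run() is called

@app.route("/predict", methods=["POST"])
def predict():
    # refuse to serve predictions until the model is in memory,
    # returning 503 (Service Unavailable) rather than crashing mid-request
    if model is None:
        return flask.jsonify({"success": False,
                              "error": "model not loaded"}), 503

    # ... normal preprocessing and inference would go here ...
    return flask.jsonify({"success": True})
```

    Because load_model() blocks before app.run() in our script, this guard should never fire in practice; it is cheap insurance against reordered startup logic.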


    How to not load a Keras model in a REST API

    You may be tempted to load your model inside your predict function, like so:

    ...
        # ensure an image was properly uploaded to our endpoint
        if request.method == "POST":
            if request.files.get("image"):
                # read the image in PIL format
                image = request.files["image"].read()
                image = Image.open(io.BytesIO(image))
    
                # preprocess the image and prepare it for classification
                image = prepare_image(image, target=(224, 224))
    
                # load the model
                model = ResNet50(weights="imagenet")
    
                # classify the input image and then initialize the list
                # of predictions to return to the client
                preds = model.predict(image)
                results = imagenet_utils.decode_predictions(preds)
                data["predictions"] = []
    ...
    

    This code implies that the model will be loaded each and every time a new request comes in. This is incredibly inefficient and can even cause your system to run out of memory.

    If you try to run the code above you'll notice that your API will run considerably slower (especially if your model is large) — this is due to the significant overhead in both I/O and CPU operations used to load your model for each new request.

    To see how this can easily overwhelm your server's memory, let's suppose we have N incoming requests to our server at the same time. This implies there will be N models loaded into memory...again, at the same time. If your model is large, such as ResNet, storing N copies of the model in RAM could easily exhaust the system memory.

    To this end, try to avoid loading a new model instance for every new incoming request unless you have a very specific, justifiable reason for doing so.

    Caveat: We are assuming you are using the default Flask server, which is single-threaded. If you deploy to a multi-threaded server you could be in a situation where you are still loading multiple models in memory even when using the "more correct" method discussed earlier in this post. If you intend on using a dedicated server such as Apache or nginx you should consider making your pipeline more scalable, as discussed here.


    Starting your Keras REST API

    Starting the Keras REST API service is easy.

    Open up a terminal and execute:

    $ python run_keras_server.py
    Using TensorFlow backend.
     * Loading Keras model and Flask starting server...please wait until server has fully started
    ...
     * Running on http://127.0.0.1:5000
    

    As you can see from the output, our model is loaded first — after which we can start our Flask server.

    You can now access the server via http://127.0.0.1:5000.

    However, if you were to copy and paste the IP address + port into your browser you would see the following image:

    [Figure: browser showing a 404 Not Found error]

    This is because there is no index/homepage set in the Flask URL routes.
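    If you'd like the root URL to return something friendlier than a 404, you can register a simple index route (a hypothetical addition; the original script only defines /predict):

```python
import flask

app = flask.Flask(__name__)

@app.route("/")
def index():
    # a minimal landing page so visiting the root URL no longer 404s
    return "Keras REST API is running. POST an image to /predict."
```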

    Instead, try to access the /predict endpoint via your browser:

    [Figure: browser showing a "Method Not Allowed" error]

    And you'll see a "Method Not Allowed" error. This error occurs because your browser is performing a GET request, but /predict only accepts a POST (which we'll demonstrate how to perform in the next section).


    Using cURL to test the Keras REST API

    When testing and debugging your Keras REST API, consider using cURL (which is a good tool to learn how to use, regardless).

    Below you can see the image we wish to classify, a dog, but more specifically a beagle:

    [Figure: input image of a beagle (dog.jpg)]

    We can use curl to pass this image to our API and find out what ResNet thinks the image contains:

    $ curl -X POST -F image=@dog.jpg 'http://localhost:5000/predict'
    {
      "predictions": [
        {
          "label": "beagle",
          "probability": 0.9901360869407654
        },
        {
          "label": "Walker_hound",
          "probability": 0.002396771451458335
        },
        {
          "label": "pot",
          "probability": 0.0013951235450804234
        },
        {
          "label": "Brittany_spaniel",
          "probability": 0.001283277408219874
        },
        {
          "label": "bluetick",
          "probability": 0.0010894243605434895
        }
      ],
      "success": true
    }
    

    The -X flag and POST value indicate we're performing a POST request.

    We supply -F image=@dog.jpg to indicate we're submitting form encoded data. The image key is then set to the contents of the dog.jpg file. Supplying the @ prior to dog.jpg implies we would like cURL to load the contents of the image and pass the data to the request.

    Finally, we have our endpoint: http://localhost:5000/predict

    Notice how the input image is correctly classified as "beagle" with 99.01% confidence. The remaining top-5 predictions and their associated probabilities are included in the response from our Keras API as well.


    Consuming the Keras REST API programmatically

    In all likelihood, you will be both submitting data to your Keras REST API and then consuming the returned predictions in some manner — this requires we programmatically handle the response from our server.

    This is a straightforward process using the requests Python package:

    # import the necessary packages
    import requests
    
    # initialize the Keras REST API endpoint URL along with the input
    # image path
    KERAS_REST_API_URL = "http://localhost:5000/predict"
    IMAGE_PATH = "dog.jpg"
    
    # load the input image and construct the payload for the request
    image = open(IMAGE_PATH, "rb").read()
    payload = {"image": image}
    
    # submit the request
    r = requests.post(KERAS_REST_API_URL, files=payload).json()
    
    # ensure the request was successful
    if r["success"]:
        # loop over the predictions and display them
        for (i, result) in enumerate(r["predictions"]):
            print("{}. {}: {:.4f}".format(i + 1, result["label"],
                result["probability"]))
    
    # otherwise, the request failed
    else:
        print("Request failed")
    

    The KERAS_REST_API_URL specifies our endpoint while the IMAGE_PATH is the path to our input image residing on disk.

    Using the IMAGE_PATH we load the image and then construct the payload to the request.

    Given the payload we can POST the data to our endpoint using a call to requests.post. Appending .json() to the end of the call instructs requests that:

    1. The response from the server should be in JSON
    2. We would like the JSON object automatically parsed and deserialized for us

    Once we have the output of the request, r, we can check if the classification is a success (or not) and then loop over r["predictions"].
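    That response-handling logic can also be factored into a small helper, which is easy to unit test without a running server. The name format_predictions is hypothetical, not part of the original script:

```python
def format_predictions(response_json):
    # turn the parsed JSON reply from the /predict endpoint, i.e.
    # {"success": bool, "predictions": [{"label", "probability"}, ...]},
    # into ranked, human-readable lines
    if not response_json.get("success"):
        return ["Request failed"]
    return [
        "{}. {}: {:.4f}".format(i + 1, p["label"], p["probability"])
        for i, p in enumerate(response_json.get("predictions", []))
    ]
```

    Keeping the formatting separate from the network call makes it simple to verify the client logic against canned responses.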

    To run simple_request.py, first ensure run_keras_server.py (i.e., the Flask web server) is currently running. From there, execute the following command in a separate shell:

    $ python simple_request.py
    1. beagle: 0.9901
    2. Walker_hound: 0.0024
    3. pot: 0.0014
    4. Brittany_spaniel: 0.0013
    5. bluetick: 0.0011
    

    We have successfully called the Keras REST API and obtained the model's predictions via Python.


    In this post you learned how to:

    • Wrap a Keras model as a REST API using the Flask web framework
    • Load your model into memory once, prior to accepting requests, so it's ready for efficient inference
    • Call the resulting API using both cURL and the Python requests package

    The code covered in this tutorial can be found here and is meant to be used as a template for your own Keras REST API — feel free to modify it as you see fit.

    Please keep in mind that the code in this post is meant to be instructional. It is not meant to be production-level and capable of scaling under heavy load and a large number of incoming requests.

    This method is best used when:

    1. You need to quickly stand up a REST API for your Keras deep learning model
    2. Your endpoint is not going to be hit heavily

    If you're interested in a more advanced Keras REST API that leverages message queues and batching, please refer to this blog post.

    If you have any questions or comments on this post please reach out to Adrian from PyImageSearch (the author of today's post). For suggestions on future topics to cover, please find Francois on Twitter.

