.. _extraction-api:

Direct API requests to ``findface-extraction-api``
====================================================

You can use HTTP API to extract data directly from the ``findface-extraction-api`` component.

.. note::
   Being a ``findface-sf-api`` counterpart when it comes to face feature extraction via API, ``findface-extraction-api`` is more resource-demanding. The component cannot fully substitute ``findface-sf-api``, as it does not allow adding faces or working with the database.

.. tip::
   Normalized images received from ``findface-extraction-api`` are qualified for posting to ``findface-sf-api``.

.. rubric:: In this section:

.. contents::
   :local:

API Requests
--------------------------

The ``findface-extraction-api`` component accepts POST requests to ``http://127.0.0.1:18666/``.

There are two ways to format the request body:

* ``application/json``: the request body contains only JSON (see the sketch at the end of this section).
* ``multipart/form-data``: the request body contains a JSON part with the request itself, while the other body parts are used for image transfer.

The JSON part of the request body contains a set of requests:

.. code::

    {
        "requests": [request1, request2, .., requestN],
        "include_timings": true|false // include face processing timings in the response, false by default
    }

Each request in the set applies to a specific image or region in the image and accepts the following parameters:

.. important::
   To enable recognition of face features, you can use either the new (preferred) or the old API parameters. The old API allows you to recognize gender, age, and emotions, while the new API provides recognition of gender, age, emotions, country, beard, and glasses. Each face feature (gender, age, emotions, country, beard, or glasses) must be mentioned only once in a request, either in the new or in the old API format.

.. _auto-rotate:

* ``"image"``: an uploaded image (use ``multipart:part`` to refer to a relevant request body ``part``), or a publicly accessible image URL (``http:``, ``https:``).
* ``"roi"``: a region of interest in the image. If the region is not specified, the entire image is processed.
* ``"detector"``: a face detector to apply to the image (``legacy``, ``nnd``, or ``prenormalized``). The ``prenormalized`` mode accepts normalized face images and omits detecting faces. Use ``nnd`` if you need to estimate the face quality (``"quality_estimator": true``).
* ``"need_facen"``: if true, the request returns a facen in the response.
* ``"need_gender"``: returns gender (old API).
* ``"need_emotions"``: returns emotions (old API).
* ``"need_age"``: returns age (old API).
* ``"need_normalized"``: returns a normalized face image encoded in base64. The normalized image can then be posted again to the ``findface-extraction-api`` component as "prenormalized".
* ``"auto_rotate"``: if true, auto-rotates the original image to 4 different orientations and returns faces detected in each orientation. Works only if ``"detector": "nnd"`` and ``"quality_estimator": true``.
* ``"attributes"``: an array of strings in the format ``["gender", "age", "emotions", "countries47", "beard", "glasses3"]``; enables recognition of the face features passed in the array (new API).

.. code::

    {
        "image": "http://static.findface.pro/sample.jpg",
        "roi": {"left": 0, "right": 1000, "top": 0, "bottom": 1000},
        "detector": "nnd",
        "need_facen": true,
        "need_gender": true,
        "need_emotions": true,
        "need_age": true,
        "need_normalized": true,
        "auto_rotate": true
    }
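The examples later in this section post the request as ``multipart/form-data``. When the ``application/json`` format is used instead, the JSON above is sent directly as the request body and images are referenced by URL. The following is a minimal sketch of such a request, not an official example; it assumes the component listens on the default port and that ``jq`` is installed for pretty-printing:

.. code::

    # Pure application/json request: no multipart parts, image passed by URL
    curl -s -X POST -H 'Content-Type: application/json' \
         -d '{"requests":[{"image":"https://static.findface.pro/sample.jpg","detector":"nnd","need_gender":true,"need_age":true}]}' \
         http://127.0.0.1:18666/ | jq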
code:: { "image": "http://static.findface.pro/sample.jpg", "roi": {"left": 0, "right": 1000, "top": 0, "bottom": 1000}, "detector": "nnd", "need_facen": true, "need_gender": true, "need_emotions": true, "need_age": true, "need_normalized": true, "auto_rotate": true } API Response Format ----------------------------- A typical response from the ``findface-extraction-api`` component contains a set of responses to the requests wrapped into the main API request: .. code:: { "response": [response1, response2, .., responseN] } Each response in the set contains the following JSON data: * ``"faces"``: a set of faces detected in the provided image or region of interest. * ``"error"``: an error occurred during processing (if any). The error body includes the error code which can be interpreted automatically (``"code"``) and a human-readable description (``"desc"``). * ``"facen_model"``: face extraction model if ``"need_facen": true``. * ``"timings"``: processing timings if ``"include_timings": true``. .. code:: { "faces": [face1, face2, .., faceN], "error": { "code": "IMAGE_DECODING_FAILED", "desc": "Failed to decode: reason" } "facen_model": "elderberry_576", "timings": ... } Each face in the set is provided with the following data: .. _detection_score: * ``"bbox"``: coordinates of a bounding box with the face. * ``"detection_score"``: either the face detection accuracy, or the face quality score (depending on whether ``quality_estimator`` is ``false`` or ``true`` at ``/etc/findface-extraction-api.ini``). Upright faces in frontal position are considered the best quality. They result in values around ``0``, mostly negative (such as ``-0.00067401276``, for example). Inverted faces and large face angles are estimated with negative values some ``-5`` and less. * ``"facen"``: face feature vector. * ``"gender"``: gender information (MALE or FEMALE) with recognition accuracy if requested (old API). * ``"age"``: age estimate if requested (old API). * ``"emotions"``: all available emotions in descending order of probability if requested (old API).  * ``"countries47"``: probable countries of origin with algorithm confidence in the result if requested (old API). * ``"attributes"``: gender (``male`` or ``female``), age (number of years), emotions (predominant emotion), probable countries of origin, beard (``beard`` or ``none``), glasses (``sun``, ``eye``, or ``none``), along with algorithm confidence in the result if requested (new API). * ``"normalized"``: a normalized face image encoded in base64 if requested. * ``"timings"``: face processing timings if requested. .. code:: { "bbox": { "left": 1, "right": 2, "top": 3, "bottom": 4}, "detection_score": 0.99, "facen": "...", "gender": { "gender": "MALE", "score": "1.123" }, "age": 23.59, "emotions": [ { "emotion": "neutral", "score": 0.95 }, { "emotion": "angry", "score": 0.55 }, ... ], "normalized": "...", "attributes": { "age": { "attribute": "age", "model": "age.v1", "result": 25 }, "beard": { "attribute": "beard", "model": "beard.v0", "result": [ { "confidence": 0.015328666, "name": "beard" } ] }, "countries47": { "attribute": "countries47", "model": "countries47.v1", "result": [ { "confidence": 0.90330666, "name": "UKR" }, { "confidence": 0.013165677, "name": "RUS" }, { "confidence": 0.009136979, "name": "POL" }, ... 
] }, "emotions": { "attribute": "emotions", "model": "emotions.v1", "result": [ { "confidence": 0.99959123, "name": "neutral" }, { "confidence": 0.00039093022, "name": "sad" }, { "confidence": 8.647058e-06, "name": "happy" }, { "confidence": 7.994732e-06, "name": "surprise" }, { "confidence": 6.495376e-07, "name": "disgust" }, { "confidence": 6.063106e-07, "name": "angry" }, { "confidence": 7.077886e-10, "name": "fear" } ] }, "gender": { "attribute": "gender", "model": "gender.v2", "result": [ { "confidence": 0.999894, "name": "female" }, { "confidence": 0.00010597264, "name": "male" } ] }, "glasses3": { "attribute": "glasses3", "model": "glasses3.v0", "result": [ { "confidence": 0.9995815, "name": "none" }, { "confidence": 0.0003348241, "name": "eye" }, { "confidence": 8.363914e-05, "name": "sun" } ] } } "timings": ... } Examples ------------------- .. rubric:: Request #1 .. code:: curl -X POST -F sample=@sample.jpg -F 'request={"requests":[{"image":"multipart:sample","detector":"nnd", "need_gender":true, "need_normalized": true, "need_facen": true}]}' http://127.0.0.1:18666/| jq .. rubric:: Response .. code:: { "responses": [ { "faces": [ { "bbox": { "left": 595, "top": 127, "right": 812, "bottom": 344 }, "detection_score": -0.0012599, "facen": "qErDPTE...vd4oMr0=", "gender": { "gender": "FEMALE", "score": -2.6415858 }, "normalized": "iVBORw0KGgoAAAANSUhE...79CIbv" } ] } ] } .. rubric:: Request #2 .. code:: curl -X POST -F 'request={"requests": [{"need_age": true, "need_gender": true, "detector": "nnd", "roi": {"left": -2975, "top": -635, "right": 4060, "bottom": 1720}, "image": "https://static.findface.pro/sample.jpg", "need_emotions": true}]}' http://127.0.0.1:18666/ |jq .. rubric:: Response .. code:: { "responses": [ { "faces": [ { "bbox": { "left": 595, "top": 127, "right": 812, "bottom": 344 }, "detection_score": 0.9999999, "gender": { "gender": "FEMALE", "score": -2.6415858 }, "age": 26.048346, "emotions": [ { "emotion": "neutral", "score": 0.90854686 }, { "emotion": "sad", "score": 0.051211596 }, { "emotion": "happy", "score": 0.045291856 }, { "emotion": "surprise", "score": -0.024765536 }, { "emotion": "fear", "score": -0.11788454 }, { "emotion": "angry", "score": -0.1723868 }, { "emotion": "disgust", "score": -0.35445923 } ] } ] } ] } .. rubric:: Request #3. Auto-rotation .. code:: curl -s -F 'sample=@/path/to/your/photo.png' -F 'request={"requests":[{"image":"multipart:sample","detector":"nnd", "auto_rotate": true, "need_normalized": true }]}' http://192.168.113.79:18666/ .. rubric:: Response .. code:: { "responses": [ { "faces": [ { "bbox": { "left": 96, "top": 99, "right": 196, "bottom": 198 }, "detection_score": -0.00019264, "normalized": "iVBORw0KGgoAAAANSUhE....quWKAAC" }, { "bbox": { "left": 205, "top": 91, "right": 336, "bottom": 223 }, "detection_score": -0.00041600747, "normalized": "iVBORw0KGgoAAAANSUhEUgAA....AByquWKAACAAElEQVR4nKy96XYbybIdnF" } ] } ] } .. rubric:: Request #4. New API usage (attributes: "beard", "emotions", "age", "gender", "glasses3", "face") .. code:: curl -s -F photo=@sample.jpg -Frequest='{"requests": [{"image":"multipart:photo", "detector": "nnd", "attributes": ["beard", "emotions", "age", "gender", "glasses3", "face"]}]}' http://127.0.0.1:18666 | jq .. rubric:: Response .. 
code:: { "responses": [ { "faces": [ { "bbox": { "left": 595, "top": 127, "right": 812, "bottom": 344 }, "detection_score": -0.00067401276, "rotation_angle": 0, "attributes": { "age": { "attribute": "age", "model": "age.v1", "result": 25 }, "beard": { "attribute": "beard", "model": "beard.v0", "result": [ { "confidence": 0.015324414, "name": "beard" } ] }, "emotions": { "attribute": "emotions", "model": "emotions.v1", "result": [ { "confidence": 0.99958, "name": "neutral" }, { "confidence": 0.0004020365, "name": "sad" }, { "confidence": 8.603454e-06, "name": "happy" }, { "confidence": 8.076766e-06, "name": "surprise" }, { "confidence": 6.6535216e-07, "name": "disgust" }, { "confidence": 6.1434775e-07, "name": "angry" }, { "confidence": 7.3372125e-10, "name": "fear" } ] }, "face": { "attribute": "face", "model": "elderberry_576", "result": "KjiHu6cWh70ppqa9l" }, "gender": { "attribute": "gender", "model": "gender.v2", "result": [ { "confidence": 0.9998938, "name": "female" }, { "confidence": 0.000106243206, "name": "male" } ] }, "glasses3": { "attribute": "glasses3", "model": "glasses3.v0", "result": [ { "confidence": 0.99958307, "name": "none" }, { "confidence": 0.00033243417, "name": "eye" }, { "confidence": 8.4465064e-05, "name": "sun" } ] } } } ], "orientation": 1 } ] }