Machine Generated Data
Face analysis

AWS Rekognition
Age | 21-29 |
Gender | Male, 99.6% |
Calm | 61.5% |
Happy | 14.1% |
Disgusted | 10.7% |
Confused | 5.1% |
Sad | 5% |
Angry | 2% |
Surprised | 1.1% |
Fear | 0.6% |

AWS Rekognition
Age | 27-37 |
Gender | Female, 62.5% |
Calm | 60.4% |
Happy | 36.4% |
Sad | 1.3% |
Surprised | 0.4% |
Angry | 0.4% |
Disgusted | 0.4% |
Fear | 0.4% |
Confused | 0.3% |

AWS Rekognition
Age | 23-31 |
Gender | Male, 99.5% |
Calm | 86.6% |
Happy | 5.7% |
Sad | 3.6% |
Surprised | 1.6% |
Confused | 0.8% |
Angry | 0.8% |
Disgusted | 0.4% |
Fear | 0.4% |

AWS Rekognition
Age | 23-33 |
Gender | Male, 98.8% |
Surprised | 69.6% |
Sad | 11% |
Calm | 9.9% |
Happy | 3.3% |
Angry | 2.7% |
Disgusted | 2% |
Confused | 1% |
Fear | 0.5% |

AWS Rekognition
Age | 45-53 |
Gender | Male, 96% |
Calm | 89.7% |
Sad | 9.2% |
Confused | 0.6% |
Surprised | 0.1% |
Happy | 0.1% |
Disgusted | 0.1% |
Angry | 0.1% |
Fear | 0% |
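
The age ranges, gender values, and emotion scores above are the kind of per-face output returned by Amazon Rekognition's DetectFaces operation. A minimal sketch using boto3 follows; the file name, region, and output formatting are illustrative assumptions, not the archive's actual pipeline:

```python
# Sketch: querying Amazon Rekognition for face attributes with boto3.
# "photo.jpg" and the region are placeholder assumptions.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age | {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender | {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; sort by confidence, as in the tables above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} | {emotion['Confidence']:.1f}%")
```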

Google Vision
Surprise | Very unlikely |
Anger | Very unlikely |
Sorrow | Very unlikely |
Joy | Very unlikely |
Headwear | Very unlikely |
Blurred | Very unlikely |

Google Vision
Surprise | Very unlikely |
Anger | Very unlikely |
Sorrow | Very unlikely |
Joy | Very unlikely |
Headwear | Very unlikely |
Blurred | Very unlikely |

Google Vision
Surprise | Very unlikely |
Anger | Very unlikely |
Sorrow | Very unlikely |
Joy | Very unlikely |
Headwear | Very unlikely |
Blurred | Very unlikely |

Google Vision
Surprise | Very unlikely |
Anger | Very unlikely |
Sorrow | Very unlikely |
Joy | Very unlikely |
Headwear | Very unlikely |
Blurred | Very unlikely |

Google Vision
Surprise | Very unlikely |
Anger | Very unlikely |
Sorrow | Very unlikely |
Joy | Very unlikely |
Headwear | Unlikely |
Blurred | Very unlikely |

Google Vision
Surprise | Very unlikely |
Anger | Very unlikely |
Sorrow | Very unlikely |
Joy | Very unlikely |
Headwear | Very unlikely |
Blurred | Very unlikely |
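
The likelihood buckets above (Very unlikely, Unlikely, ...) correspond to Google Cloud Vision face detection. A minimal sketch with the google-cloud-vision client library; the file name is an illustrative assumption and credentials are taken from the environment:

```python
# Sketch: face detection with the Google Cloud Vision client library.
# "photo.jpg" is a placeholder assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is reported as a likelihood bucket, e.g. VERY_UNLIKELY.
    print("Surprise |", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger |", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow |", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy |", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear |", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred |", vision.Likelihood(face.blurred_likelihood).name)
```
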
Feature analysis
Categories
Imagga
paintings art | 97.1% |
people portraits | 2.2% |
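
The category pair above matches the output shape of Imagga's categorization REST endpoint. A rough sketch with requests; the categorizer name, image URL, and credentials are assumptions based on Imagga's public API documentation:

```python
# Sketch: image categorization via Imagga's REST API.
# Categorizer name ("personal_photos"), image URL, and keys are assumptions.
import requests

API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for category in response.json()["result"]["categories"]:
    # e.g. "paintings art | 97.1%"
    print(f"{category['name']['en']} | {category['confidence']:.1f}%")
```
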
Captions
Microsoft
Created by unknown on 2022-01-23
a group of people posing for the camera | 77.4% |
a group of people posing for a picture | 77.3% |
a group of people standing around a table | 75.4% |
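
Ranked captions of this form are produced by the Describe Image operation of Azure Computer Vision. A minimal sketch against the v3.2 REST endpoint; the endpoint host, key, and image URL are placeholder assumptions:

```python
# Sketch: image captioning via Azure Computer Vision "describe" (v3.2 REST API).
# Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "your_subscription_key"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    # e.g. "a group of people posing for the camera | 77.4%"
    print(f"{caption['text']} | {caption['confidence'] * 100:.1f}%")
```
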
Google Gemini
Created by gemini-2.0-flash-lite on 2025-05-17
Here's a description of the image:
The image is a historical black and white negative print depicting a group of people indoors. It appears to be a portrait of five men, posed in an elegant room.
Here are some of the details visible:
- Composition: The group is arranged in a somewhat informal way. Two men sit in upholstered chairs, another on a sofa and two standing.
- Attire: The men are dressed in what appears to be formal or semi-formal attire. They are wearing white jackets and/or shirts and ties. One man wears a hat.
- Setting: The room has a decorative wall. There are framed pictures on the wall. There is a table or stand with a radio receiver.
- Style: The style of the photograph suggests it may be from the early 20th century.
Created by gemini-2.0-flash on 2025-05-17
Here is a description of the image:
This is a black and white photographic negative of a group of people, likely taken indoors.
There are five individuals in the image. The person standing on the left appears to be adjusting or operating a machine or device on a table, while the others are seated or standing nearby. The subjects are all dressed in clothing typical of the period, which appears to be early 20th century based on the style.
In the background, there are framed pictures hanging on the wall, and a window with curtains. The room seems to be furnished with period-appropriate furniture, including upholstered chairs and a patterned rug. The image has the numbers "11528" at the top and bottom, which may be an identification number. There is also some handwritten text along the right edge of the image.
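
Free-text descriptions like the two above can be requested from the Gemini models named in their headers. A minimal sketch with the google-generativeai Python package; the prompt wording and file name are assumptions:

```python
# Sketch: asking a Gemini model to describe an image with google-generativeai.
# The prompt text and "photo.jpg" are placeholder assumptions.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="your_api_key")

model = genai.GenerativeModel("gemini-2.0-flash")  # or "gemini-2.0-flash-lite"
image = Image.open("photo.jpg")

response = model.generate_content(
    ["Describe this historical photograph in a few sentences.", image]
)
print(response.text)
```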