Machine Generated Data
Face analysis
![](https://ids.lib.harvard.edu/ids/iiif/43159973/808,355,120,157/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 11-19 |
| Gender | Female, 100% |
| Disgusted | 43.8% |
| Confused | 18.3% |
| Fear | 10.8% |
| Calm | 9.9% |
| Surprised | 9.6% |
| Angry | 6.3% |
| Sad | 4.5% |
| Happy | 1.1% |
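Rankings like the one above can be reproduced by sorting the per-face emotion confidences that Amazon Rekognition's `DetectFaces` API returns. A minimal sketch, assuming the `Emotions` list shape of Rekognition's `FaceDetail` response; the values are copied from the face above:

```python
# One face's emotion scores in the shape Rekognition's DetectFaces returns
# (FaceDetail["Emotions"]); values here are taken from the table above.
emotions = [
    {"Type": "DISGUSTED", "Confidence": 43.8},
    {"Type": "CONFUSED", "Confidence": 18.3},
    {"Type": "FEAR", "Confidence": 10.8},
    {"Type": "CALM", "Confidence": 9.9},
    {"Type": "SURPRISED", "Confidence": 9.6},
    {"Type": "ANGRY", "Confidence": 6.3},
    {"Type": "SAD", "Confidence": 4.5},
    {"Type": "HAPPY", "Confidence": 1.1},
]

# Sort by descending confidence, the order used throughout this page.
ranked = sorted(emotions, key=lambda e: e["Confidence"], reverse=True)
for e in ranked:
    print(f'{e["Type"].capitalize()} | {e["Confidence"]}%')
```

Note that the scores are independent confidences, not a probability distribution, so they need not sum to 100%.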
![](https://ids.lib.harvard.edu/ids/iiif/43159973/221,469,109,153/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 6-14 |
| Gender | Female, 99.9% |
| Sad | 92.2% |
| Calm | 42% |
| Confused | 7.6% |
| Surprised | 6.5% |
| Fear | 6% |
| Angry | 5% |
| Disgusted | 0.6% |
| Happy | 0.2% |
![](https://ids.lib.harvard.edu/ids/iiif/43159973/364,258,106,133/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 10-18 |
| Gender | Female, 99.9% |
| Calm | 93.1% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Sad | 3.2% |
| Confused | 2.8% |
| Angry | 0.7% |
| Disgusted | 0.1% |
| Happy | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/43159973/633,269,37,51/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 54-64 |
| Gender | Male, 99.3% |
| Calm | 62.5% |
| Confused | 22.9% |
| Sad | 7.4% |
| Surprised | 6.5% |
| Fear | 6% |
| Angry | 2.5% |
| Disgusted | 2.1% |
| Happy | 0.2% |
![](https://ids.lib.harvard.edu/ids/iiif/43159973/709,345,22,34/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 40-48 |
| Gender | Female, 59.9% |
| Sad | 100% |
| Surprised | 6.3% |
| Fear | 6.2% |
| Calm | 3.1% |
| Disgusted | 0.5% |
| Happy | 0.4% |
| Angry | 0.4% |
| Confused | 0.2% |
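Each face crop on this page is addressed through the IIIF Image API: the path segment after the image identifier is a pixel region `x,y,w,h`, followed by the size (`full`), rotation (`0`), and quality/format (`native.jpg`). A minimal sketch of how these URLs are assembled, with the helper name my own and the base URL taken from the images above:

```python
def iiif_crop_url(identifier, x, y, w, h,
                  base="https://ids.lib.harvard.edu/ids/iiif"):
    # IIIF Image API path: {identifier}/{region}/{size}/{rotation}/{quality}.{format}
    # The region "x,y,w,h" selects a pixel rectangle; "full" keeps the crop at
    # its native resolution; rotation is 0; "native" is the quality keyword.
    return f"{base}/{identifier}/{x},{y},{w},{h}/full/0/native.jpg"

# Rebuilds the first face crop on this page.
print(iiif_crop_url(43159973, 808, 355, 120, 157))
# → https://ids.lib.harvard.edu/ids/iiif/43159973/808,355,120,157/full/0/native.jpg
```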
![](https://ids.lib.harvard.edu/ids/iiif/43159973/820,381,125,125/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 31 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/43159973/234,503,117,117/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 11 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/43159973/367,280,106,106/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 13 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/43159973/342,234,155,181/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Possible |
| Joy | Very unlikely |
| Headwear | Unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/43159973/771,311,190,221/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/43159973/185,434,180,210/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Unlikely |
| Joy | Very unlikely |
| Headwear | Possible |
| Blurred | Very unlikely |
Feature analysis
Categories
Imagga
| Category | Confidence |
| --- | --- |
| pets animals | 52.8% |
| people portraits | 25.3% |
| paintings art | 20.3% |
Captions
Microsoft
Created on 2018-05-11
| Caption | Confidence |
| --- | --- |
| a group of people posing for a photo | 96.6% |
| a group of people posing for the camera | 96.5% |
| an old photo of a group of people posing for the camera | 95.3% |
Azure OpenAI
Created on 2024-01-26
This is a grayscale photograph featuring a group of individuals on a street. In the background, there is a two-story building that exhibits a degree of wear and age, marked by its weathered facade and wooden shutters. The photograph appears to have been taken in a bygone era, suggested by the vintage automobile parked on the side of the road. Tall palm trees and other mature vegetation can be seen lining the street. The context of the image suggests it might have been captured in an urban environment during the early to mid-20th century.
Anthropic Claude
Created on 2024-03-30
The image appears to be a black and white photograph taken outdoors, possibly in a small town or city setting. The photograph shows a group of people, including several women and children, standing on a sidewalk or street. One woman in the center of the image has a serious expression on her face, while the other individuals in the group have a range of expressions. In the background, there is a wooden building with a porch, and a vintage-style automobile can be seen parked on the street.