Machine Generated Data
Tags
Color Analysis
Face analysis
![](https://ids.lib.harvard.edu/ids/iiif/32640544/112,226,48,54/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 23-38 |
| Gender | Male, 54.7% |
| Calm | 54% |
| Surprised | 45.2% |
| Confused | 45.2% |
| Angry | 45.2% |
| Happy | 45.2% |
| Sad | 45.2% |
| Disgusted | 45.1% |
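The face crops on this page are served through the IIIF Image API, whose URL path encodes `{region}/{size}/{rotation}/{quality}.{format}`, with the region given as `x,y,w,h` pixel coordinates. As a minimal sketch, a crop URL for a detected bounding box can be assembled like this (the base URL and box values are taken from the first crop above):

```python
# Build a IIIF Image API URL for a face crop.
# Path segments after the identifier: /{region}/{size}/{rotation}/{quality}.{format}
BASE = "https://ids.lib.harvard.edu/ids/iiif/32640544"

def iiif_crop_url(x: int, y: int, w: int, h: int) -> str:
    """Return a IIIF URL that crops region (x, y, w, h) at full size."""
    region = f"{x},{y},{w},{h}"  # pixel-coordinate region: x,y,width,height
    return f"{BASE}/{region}/full/0/native.jpg"

print(iiif_crop_url(112, 226, 48, 54))
# → https://ids.lib.harvard.edu/ids/iiif/32640544/112,226,48,54/full/0/native.jpg
```

Changing the `full` size segment (e.g. to `200,`) would rescale the crop; the tables below each correspond to one such cropped face region.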
![](https://ids.lib.harvard.edu/ids/iiif/32640544/856,242,40,60/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 35-52 |
| Gender | Male, 51.3% |
| Calm | 48.9% |
| Angry | 47.3% |
| Disgusted | 46.4% |
| Surprised | 45.7% |
| Sad | 45.7% |
| Confused | 45.6% |
| Happy | 45.4% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/509,195,88,108/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 57-77 |
| Gender | Male, 97.1% |
| Angry | 35.1% |
| Sad | 24.8% |
| Disgusted | 18.3% |
| Calm | 9.5% |
| Confused | 9.3% |
| Surprised | 1.9% |
| Happy | 1.1% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/22,214,64,84/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 48-68 |
| Gender | Male, 78.7% |
| Calm | 95.6% |
| Sad | 2.6% |
| Happy | 0.5% |
| Angry | 0.5% |
| Confused | 0.3% |
| Surprised | 0.3% |
| Disgusted | 0.2% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/265,258,49,61/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 48-68 |
| Gender | Male, 52.2% |
| Calm | 54.6% |
| Sad | 45.1% |
| Confused | 45.1% |
| Happy | 45.1% |
| Disgusted | 45.1% |
| Surprised | 45.1% |
| Angry | 45% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/74,268,48,61/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 23-38 |
| Gender | Male, 54.4% |
| Sad | 53.5% |
| Angry | 45.4% |
| Calm | 45.4% |
| Confused | 45.2% |
| Surprised | 45.2% |
| Disgusted | 45.2% |
| Happy | 45.1% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/680,306,47,50/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 35-52 |
| Gender | Male, 50.1% |
| Calm | 50.3% |
| Angry | 46.2% |
| Sad | 46% |
| Disgusted | 45.9% |
| Surprised | 45.6% |
| Confused | 45.6% |
| Happy | 45.4% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/740,282,28,35/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 35-52 |
| Gender | Male, 51.5% |
| Calm | 52.8% |
| Disgusted | 45.6% |
| Sad | 45.5% |
| Angry | 45.4% |
| Surprised | 45.2% |
| Happy | 45.2% |
| Confused | 45.2% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/810,276,35,43/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 26-43 |
| Gender | Female, 53.2% |
| Disgusted | 50.2% |
| Calm | 47.2% |
| Angry | 45.9% |
| Surprised | 45.7% |
| Happy | 45.5% |
| Sad | 45.3% |
| Confused | 45.2% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/203,264,32,56/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 14-25 |
| Gender | Male, 54.7% |
| Calm | 54.2% |
| Disgusted | 45.2% |
| Happy | 45.2% |
| Sad | 45.1% |
| Surprised | 45.1% |
| Angry | 45.1% |
| Confused | 45.1% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/618,292,26,30/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 20-38 |
| Gender | Male, 53.7% |
| Calm | 49.2% |
| Disgusted | 47.6% |
| Sad | 46.4% |
| Angry | 46.1% |
| Confused | 45.4% |
| Happy | 45.1% |
| Surprised | 45.1% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/1000,205,31,59/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 27-44 |
| Gender | Male, 53.8% |
| Disgusted | 48.2% |
| Calm | 47% |
| Angry | 47% |
| Sad | 46.7% |
| Confused | 45.6% |
| Surprised | 45.4% |
| Happy | 45.1% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/422,321,21,27/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 38-59 |
| Gender | Male, 52.5% |
| Calm | 53% |
| Sad | 46.2% |
| Angry | 45.5% |
| Surprised | 45.1% |
| Confused | 45.1% |
| Happy | 45.1% |
| Disgusted | 45% |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/396,339,12,15/full/0/native.jpg)
AWS Rekognition
| Attribute | Value |
| --- | --- |
| Age | 35-52 |
| Gender | Female, 50.4% |
| Sad | 50% |
| Calm | 49.9% |
| Disgusted | 49.5% |
| Angry | 49.5% |
| Happy | 49.5% |
| Confused | 49.5% |
| Surprised | 49.5% |
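Tables like those above correspond to the `FaceDetails` structure that Rekognition's `DetectFaces` API returns (`AgeRange`, `Gender`, and a list of `Emotions` with confidences). As a sketch of how one such structure maps onto the rows shown here — using illustrative values copied from the first face, not a live API call:

```python
# Flatten one Rekognition FaceDetail dict into "Attribute | Value" rows.
# The dict layout follows the documented DetectFaces response shape;
# the sample values below are illustrative, taken from the tables above.
def face_rows(detail: dict) -> list[str]:
    rows = [
        f"Age | {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        f"Gender | {detail['Gender']['Value']}, {detail['Gender']['Confidence']:.1f}%",
    ]
    # Emotions are reported unordered; sort by descending confidence.
    for emotion in sorted(detail["Emotions"], key=lambda e: -e["Confidence"]):
        rows.append(f"{emotion['Type'].capitalize()} | {emotion['Confidence']:.1f}%")
    return rows

sample = {
    "AgeRange": {"Low": 23, "High": 38},
    "Gender": {"Value": "Male", "Confidence": 54.7},
    "Emotions": [
        {"Type": "CALM", "Confidence": 54.0},
        {"Type": "SURPRISED", "Confidence": 45.2},
    ],
}
for row in face_rows(sample):
    print(row)
```

A real response would come from `boto3`'s `rekognition` client via `detect_faces(..., Attributes=["ALL"])`; only the flattening step is shown here.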
![](https://ids.lib.harvard.edu/ids/iiif/32640544/504,224,87,87/full/0/native.jpg)
Microsoft Cognitive Services
| Attribute | Value |
| --- | --- |
| Age | 56 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/492,184,118,138/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/32640544/13,173,99,135/full/0/native.jpg)
Google Vision
| Attribute | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
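Unlike Rekognition's percentages, Google Vision reports each face attribute as a bucket from its `Likelihood` enum rather than a numeric confidence. A minimal sketch of that documented ordering, handy for thresholding results like "Very unlikely" above:

```python
# Google Cloud Vision Likelihood enum values, in ascending order of confidence.
LIKELIHOOD = [
    "UNKNOWN",
    "VERY_UNLIKELY",
    "UNLIKELY",
    "POSSIBLE",
    "LIKELY",
    "VERY_LIKELY",
]

def at_least(value: str, threshold: str) -> bool:
    """True if a likelihood bucket meets or exceeds the threshold bucket."""
    return LIKELIHOOD.index(value) >= LIKELIHOOD.index(threshold)

print(at_least("VERY_UNLIKELY", "POSSIBLE"))  # → False
```

Every attribute in the two tables above is `VERY_UNLIKELY`, so a filter such as `at_least(value, "POSSIBLE")` would discard them all.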
Feature analysis
Amazon
| Feature | Confidence |
| --- | --- |
| Person | 99.6% |
Categories
Imagga
| Category | Confidence |
| --- | --- |
| paintings art | 42.6% |
| interior objects | 32.9% |
| food drinks | 8.6% |
| people portraits | 8.2% |
| streetview architecture | 4.8% |
| events parties | 1.9% |
| text visuals | 0.6% |
| cars vehicles | 0.2% |
| pets animals | 0.1% |
| nature landscape | 0.1% |
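The Imagga scores above behave like a distribution over categories (they sum to roughly 100%). One common way to summarize such output is to keep only the top categories up to a cumulative-confidence cutoff; a small sketch using the values from the table (the 75% threshold is an arbitrary illustration, not part of the Imagga API):

```python
# Pick top categories until a cumulative confidence threshold is reached.
def top_categories(scores: dict[str, float], threshold: float = 75.0) -> list[str]:
    picked, total = [], 0.0
    for name, pct in sorted(scores.items(), key=lambda kv: -kv[1]):
        picked.append(name)
        total += pct
        if total >= threshold:
            break
    return picked

# Leading scores from the Imagga table above.
imagga = {
    "paintings art": 42.6,
    "interior objects": 32.9,
    "food drinks": 8.6,
    "people portraits": 8.2,
    "streetview architecture": 4.8,
}
print(top_categories(imagga))  # → ['paintings art', 'interior objects']
```

With the default cutoff, the first two categories already cover 75.5% of the mass, which matches the intuition that the classifier mostly hesitates between "paintings art" and "interior objects" for this photograph.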
Captions
Microsoft
Created on 2018-03-23

| Caption | Confidence |
| --- | --- |
| a group of people holding a sign | 81.8% |
| a group of people standing next to a man holding a sign | 75.4% |
| a group of people standing in the street | 75.3% |
Azure OpenAI
Created on 2024-01-26
This is a black and white photograph depicting a scene on a busy street, with several people going about their activities. One individual stands out in the foreground, holding and possibly playing an accordion. In the background, there are signs for a business named "Liggett's Drugs" and other signs that appear to advertise candy and sundries. The attire of the individuals and the style of the signs suggest that this image might be from an earlier time period. The overall busy nature of the scene and the presence of the accordion player suggest a lively urban atmosphere.
Anthropic Claude
Created on 2024-03-29
The image appears to be a black and white photograph depicting a street scene in a city. The focal point is a man playing an accordion on the sidewalk, surrounded by a crowd of people. The man is wearing a hat and is concentrating on his instrument as he performs. In the background, there is a sign for a drugstore called "Liggett Drugs" and other buildings and signage visible, suggesting this is a bustling urban setting. The mood of the image conveys a sense of energy and activity on the city street.