Machine Generated Data
Face analysis
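The face crops below are served through the IIIF Image API, where each URL path encodes a pixel region (`x,y,width,height`), a size, a rotation, and a quality. A minimal sketch of building one of these crop URLs, using the server and identifier that appear in the entries below:

```python
# Build an IIIF Image API URL for a rectangular crop of a scanned image.
# URL template: {base}/{identifier}/{region}/{size}/{rotation}/{quality}.{format}

BASE = "https://ids.lib.harvard.edu/ids/iiif"  # IIIF server used in this document

def iiif_crop_url(identifier: str, x: int, y: int, w: int, h: int) -> str:
    """Return a URL for the w x h pixel region at (x, y), full size, unrotated."""
    region = f"{x},{y},{w},{h}"  # pixel region: x,y,width,height
    return f"{BASE}/{identifier}/{region}/full/0/native.jpg"

print(iiif_crop_url("11001225", 223, 102, 50, 70))
# -> https://ids.lib.harvard.edu/ids/iiif/11001225/223,102,50,70/full/0/native.jpg
```

The `native` quality keyword comes from IIIF Image API 1.x; later versions of the spec use `default` instead.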
![](https://ids.lib.harvard.edu/ids/iiif/11001225/223,102,50,70/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 29-45 |
| Gender | Male, 92.2% |
| Sad | 6% |
| Angry | 2.3% |
| Happy | 1.7% |
| Surprised | 4% |
| Disgusted | 2.4% |
| Calm | 81.5% |
| Confused | 2% |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/452,266,39,45/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 1-5 |
| Gender | Female, 52.4% |
| Happy | 45.1% |
| Calm | 45% |
| Sad | 54.6% |
| Surprised | 45% |
| Angry | 45.1% |
| Disgusted | 45.1% |
| Confused | 45% |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/953,116,44,56/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 35-52 |
| Gender | Female, 52.3% |
| Calm | 49.2% |
| Happy | 45.3% |
| Surprised | 45.7% |
| Confused | 45.9% |
| Angry | 45.7% |
| Sad | 45.9% |
| Disgusted | 47.3% |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/724,265,38,45/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 26-43 |
| Gender | Female, 53.6% |
| Confused | 45.8% |
| Happy | 45.1% |
| Angry | 46.2% |
| Calm | 48.9% |
| Sad | 48.5% |
| Disgusted | 45.2% |
| Surprised | 45.3% |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/651,87,32,42/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 23-38 |
| Gender | Female, 53.9% |
| Angry | 45.1% |
| Happy | 45% |
| Sad | 54.7% |
| Surprised | 45% |
| Calm | 45% |
| Disgusted | 45% |
| Confused | 45% |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/724,83,35,50/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 26-43 |
| Gender | Female, 54.5% |
| Surprised | 45.1% |
| Disgusted | 45.1% |
| Calm | 45.3% |
| Sad | 53.8% |
| Happy | 45.1% |
| Confused | 45.3% |
| Angry | 45.4% |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/526,89,39,49/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 23-38 |
| Gender | Female, 54% |
| Disgusted | 45.1% |
| Sad | 54.8% |
| Calm | 45% |
| Angry | 45.1% |
| Surprised | 45% |
| Confused | 45% |
| Happy | 45% |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/463,86,32,43/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 26-43 |
| Gender | Female, 54.9% |
| Disgusted | 45.2% |
| Surprised | 45.1% |
| Angry | 45.5% |
| Sad | 53.8% |
| Happy | 45.1% |
| Calm | 45.2% |
| Confused | 45.1% |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/935,334,53,83/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 60-90 |
| Gender | Male, 99.9% |
| Disgusted | 0.6% |
| Sad | 2% |
| Happy | 0.8% |
| Confused | 1.7% |
| Angry | 2.5% |
| Calm | 91.6% |
| Surprised | 0.9% |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/664,266,39,51/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 4-7 |
| Gender | Female, 54.6% |
| Confused | 45.1% |
| Happy | 45.1% |
| Sad | 52% |
| Disgusted | 45% |
| Surprised | 45.1% |
| Angry | 45.2% |
| Calm | 47.4% |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/570,179,53,62/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 20-38 |
| Gender | Female, 96.9% |
| Calm | 33% |
| Angry | 4.8% |
| Confused | 4.5% |
| Sad | 52.1% |
| Surprised | 1.7% |
| Happy | 1.8% |
| Disgusted | 2% |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/198,343,62,66/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 38-57 |
| Gender | Male, 98.5% |
| Sad | 3.8% |
| Surprised | 1.4% |
| Calm | 88.6% |
| Disgusted | 0.5% |
| Happy | 0.7% |
| Confused | 2.3% |
| Angry | 2.6% |
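The age, gender, and emotion tables above follow the shape of Amazon Rekognition's `DetectFaces` response, which returns `AgeRange`, `Gender`, and `Emotions` fields when called with `Attributes=["ALL"]` (e.g. via `boto3.client("rekognition").detect_faces(...)`). A sketch of flattening one `FaceDetails` entry into rows like those above; the sample dict mirrors the documented response shape, with values taken from the first face in this section:

```python
# Flatten one Rekognition DetectFaces "FaceDetails" entry into label/value rows
# like the tables above. The `sample` dict mirrors the documented response
# shape; a real call would be:
#   boto3.client("rekognition").detect_faces(Image=..., Attributes=["ALL"])

def face_rows(face: dict) -> list[tuple[str, str]]:
    rows = [
        ("Age", f"{face['AgeRange']['Low']}-{face['AgeRange']['High']}"),
        ("Gender", f"{face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%"),
    ]
    for emotion in face["Emotions"]:
        # Rekognition emotion types are upper-case, e.g. "CALM"
        rows.append((emotion["Type"].capitalize(), f"{emotion['Confidence']:.1f}%"))
    return rows

sample = {
    "AgeRange": {"Low": 29, "High": 45},
    "Gender": {"Value": "Male", "Confidence": 92.2},
    "Emotions": [{"Type": "CALM", "Confidence": 81.5},
                 {"Type": "SAD", "Confidence": 6.0}],
}

for label, value in face_rows(sample):
    print(f"| {label} | {value} |")
```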
![](https://ids.lib.harvard.edu/ids/iiif/11001225/948,123,50,50/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 54 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/718,92,39,39/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 72 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/527,97,37,37/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 50 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/663,284,34,34/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 11 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/470,95,32,32/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 36 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/726,278,31,31/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 31 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/649,98,31,31/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 29 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/460,278,31,31/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 6 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/244,229,28,28/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 4 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/936,93,77,90/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/561,149,86,100/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/724,262,44,51/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/11001225/929,327,73,85/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
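Unlike the other services, Google Vision reports face attributes as likelihood buckets rather than percentages, using its `Likelihood` enum (`UNKNOWN`, `VERY_UNLIKELY`, `UNLIKELY`, `POSSIBLE`, `LIKELY`, `VERY_LIKELY`). A small sketch of mapping those enum names to the display strings used in the tables above:

```python
# Map Google Cloud Vision `Likelihood` enum names to the display strings used
# in the tables above. The enum names follow the published Vision API enum.
LIKELIHOOD_LABELS = {
    "UNKNOWN": "Unknown",
    "VERY_UNLIKELY": "Very unlikely",
    "UNLIKELY": "Unlikely",
    "POSSIBLE": "Possible",
    "LIKELY": "Likely",
    "VERY_LIKELY": "Very likely",
}

def likelihood_label(name: str) -> str:
    """Return a human-readable label for a Likelihood enum name."""
    return LIKELIHOOD_LABELS.get(name, "Unknown")

print(likelihood_label("VERY_UNLIKELY"))  # -> Very unlikely
```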
Feature analysis
Categories
Imagga
| Category | Confidence |
| --- | --- |
| paintings art | 75.9% |
| people portraits | 19.6% |
| events parties | 3.4% |
| pets animals | 0.3% |
| text visuals | 0.2% |
| interior objects | 0.1% |
| nature landscape | 0.1% |
| food drinks | 0.1% |
Captions
Microsoft
Created on 2018-03-24

| Caption | Confidence |
| --- | --- |
| a group of people posing for a photo | 59.7% |
| a group of people posing for the camera | 59.6% |
| a group of people posing for a picture | 59.5% |
Azure OpenAI
Created on 2024-01-25
This image is a detailed painting depicting a scene with several figures in what appears to be a luxurious interior setting, characterized by a decorative floor with geometric patterns and intricate architectural details, such as Corinthian columns and a throne-like chair. Two central figures are richly dressed in voluminous robes featuring deep blues and vibrant reds, seated with an intimate and regal air. To the left, an individual wearing a golden robe and ornate mitre is shown in a kneeling position, hands raised in prayer or veneration. On the right, a figure stands holding an open book, clad in a blue cloak draped over a pink garment, suggesting scholarly or religious authority. The background is adorned with elements like a gold-leafed wall, floral motifs, and a slender red standard, giving the setting a sumptuous feel. Decorative vases and a bird, likely symbolic, accompany the group, while the foreground shows the edge of a patterned rug or textile.
Anthropic Claude
Created on 2024-03-29
The image appears to be a religious painting depicting the Virgin Mary seated on a throne, holding the infant Jesus. She is surrounded by various figures, including angels, saints, and religious figures. The painting has a visually striking and ornate composition, with vibrant colors, intricate details, and a sense of devotion and reverence. The overall scene seems to portray a religious or sacred narrative, though I cannot identify any specific individuals in the image.