Machine Generated Data
Face analysis
![](https://ids.lib.harvard.edu/ids/iiif/17358183/254,689,141,169/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 19-27 |
| Gender | Female, 98.1% |
| Calm | 94.5% |
| Surprised | 6.4% |
| Fear | 6.1% |
| Sad | 2.8% |
| Confused | 1.8% |
| Angry | 0.3% |
| Disgusted | 0.3% |
| Happy | 0.3% |
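The face crops in this section are requested through the IIIF Image API: each URL ends with a region (`x,y,w,h`), a size, a rotation, and a quality/format segment. A minimal sketch of building such a crop URL (the helper name is illustrative, not part of any official IIIF client):

```python
def iiif_region_url(base: str, x: int, y: int, w: int, h: int,
                    size: str = "full", rotation: int = 0,
                    quality: str = "native", fmt: str = "jpg") -> str:
    """Build an IIIF Image API URL for a rectangular crop.

    `base` is the image service endpoint (host prefix plus image identifier);
    the path segments follow the IIIF pattern /region/size/rotation/quality.format.
    """
    return f"{base}/{x},{y},{w},{h}/{size}/{rotation}/{quality}.{fmt}"

# Reproduces the first crop above:
url = iiif_region_url("https://ids.lib.harvard.edu/ids/iiif/17358183",
                      254, 689, 141, 169)
# → https://ids.lib.harvard.edu/ids/iiif/17358183/254,689,141,169/full/0/native.jpg
```

Note that this server uses the older `native` quality keyword; IIIF Image API 2.0+ servers typically expect `default` instead.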
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1068,547,184,177/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 13-21 |
| Gender | Female, 99.9% |
| Sad | 100% |
| Surprised | 6.3% |
| Fear | 6% |
| Calm | 4.7% |
| Confused | 1% |
| Disgusted | 0.8% |
| Angry | 0.2% |
| Happy | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/2206,628,140,192/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 30-40 |
| Gender | Male, 95.1% |
| Calm | 84.3% |
| Surprised | 8.1% |
| Sad | 7.4% |
| Fear | 6% |
| Confused | 1.6% |
| Angry | 0.7% |
| Happy | 0.6% |
| Disgusted | 0.4% |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/434,699,151,179/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 33-41 |
| Gender | Male, 99.9% |
| Sad | 100% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Calm | 1.1% |
| Confused | 0.3% |
| Angry | 0.1% |
| Disgusted | 0.1% |
| Happy | 0% |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1804,1225,145,218/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 21-29 |
| Gender | Female, 99.9% |
| Calm | 99.8% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Sad | 2.2% |
| Confused | 0% |
| Angry | 0% |
| Happy | 0% |
| Disgusted | 0% |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/761,665,137,192/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 37-45 |
| Gender | Female, 95.6% |
| Calm | 84.1% |
| Surprised | 10.1% |
| Fear | 6.3% |
| Angry | 4.6% |
| Sad | 2.9% |
| Confused | 1.6% |
| Disgusted | 0.3% |
| Happy | 0.2% |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1985,677,141,160/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 24-34 |
| Gender | Female, 99.8% |
| Calm | 99.9% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Sad | 2.2% |
| Confused | 0% |
| Happy | 0% |
| Angry | 0% |
| Disgusted | 0% |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1543,820,122,136/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 1-7 |
| Gender | Female, 70.8% |
| Calm | 97.9% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Sad | 2.5% |
| Confused | 0.4% |
| Disgusted | 0.2% |
| Happy | 0.2% |
| Angry | 0.1% |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/828,1234,147,201/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 31-41 |
| Gender | Male, 100% |
| Calm | 91.6% |
| Surprised | 6.5% |
| Fear | 6% |
| Confused | 3.4% |
| Sad | 2.5% |
| Disgusted | 1.9% |
| Angry | 0.9% |
| Happy | 0.3% |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1768,659,142,185/full/0/native.jpg)
AWS Rekognition
| Feature | Value |
| --- | --- |
| Age | 54-62 |
| Gender | Male, 100% |
| Calm | 98.1% |
| Surprised | 6.3% |
| Fear | 5.9% |
| Sad | 2.5% |
| Confused | 0.6% |
| Angry | 0% |
| Happy | 0% |
| Disgusted | 0% |
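The Rekognition emotion scores above are independent per-label confidences, not a probability distribution, which is why a face can score Sad 100% and Surprised 6.3% at once. A sketch of pulling the dominant emotion out of a `DetectFaces`-shaped face record (the sample dict below mirrors the first face table; only the field names `AgeRange`, `Gender`, `Emotions`, `Type`, and `Confidence` follow the documented response shape):

```python
def top_emotion(face_detail: dict) -> tuple[str, float]:
    """Return the (type, confidence) of the highest-scoring emotion.

    Rekognition returns, per face, a list of Emotions, each with its own
    independent confidence score; scores need not sum to 100.
    """
    best = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Sample shaped like the first face table above (abridged):
face = {
    "AgeRange": {"Low": 19, "High": 27},
    "Gender": {"Value": "Female", "Confidence": 98.1},
    "Emotions": [
        {"Type": "CALM", "Confidence": 94.5},
        {"Type": "SURPRISED", "Confidence": 6.4},
        {"Type": "FEAR", "Confidence": 6.1},
        {"Type": "SAD", "Confidence": 2.8},
    ],
}
print(top_emotion(face))  # → ('CALM', 94.5)
```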
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1764,1272,140,140/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 21 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/438,750,147,147/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 34 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/2179,675,131,131/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 30 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/289,720,128,128/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 22 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/795,693,120,120/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 22 |
| Gender | Male |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1573,864,89,89/full/0/native.jpg)
Microsoft Cognitive Services
| Feature | Value |
| --- | --- |
| Age | 4 |
| Gender | Female |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/2176,595,212,246/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1067,497,212,245/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Possible |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/727,637,190,221/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/216,644,198,230/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1765,1192,230,267/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/422,680,188,217/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1532,819,134,157/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/757,1170,247,287/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1980,642,171,197/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Very unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
![](https://ids.lib.harvard.edu/ids/iiif/17358183/1741,639,173,200/full/0/native.jpg)
Google Vision
| Feature | Likelihood |
| --- | --- |
| Surprise | Very unlikely |
| Anger | Very unlikely |
| Sorrow | Unlikely |
| Joy | Very unlikely |
| Headwear | Very unlikely |
| Blurred | Very unlikely |
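Unlike Rekognition's percentages, Google Vision reports face attributes as likelihood buckets. A sketch of comparing them against a threshold (the bucket names follow the API's `Likelihood` enum; the helper function itself is illustrative, not part of the client library):

```python
# Google Vision's Likelihood enum values, from least to most likely.
LIKELIHOOD_ORDER = [
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
    "POSSIBLE", "LIKELY", "VERY_LIKELY",
]

def at_least(likelihood: str, threshold: str) -> bool:
    """True if `likelihood` is at or above `threshold` in the enum ordering."""
    return LIKELIHOOD_ORDER.index(likelihood) >= LIKELIHOOD_ORDER.index(threshold)

# The second face above has Sorrow = Possible:
print(at_least("POSSIBLE", "POSSIBLE"))     # → True
print(at_least("VERY_UNLIKELY", "LIKELY"))  # → False
```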
Feature analysis
Categories
Imagga
| Category | Confidence |
| --- | --- |
| people portraits | 61.9% |
| events parties | 37.6% |
| streetview architecture | 0.2% |
| pets animals | 0.1% |
| paintings art | 0.1% |
Captions
Microsoft
Created on 2018-08-20
| Caption | Confidence |
| --- | --- |
| a group of people posing for the camera | 93.9% |
| a group of people posing for a picture | 93.7% |
| a group of people posing for a photo | 91% |
Azure OpenAI
Created on 2024-01-26
This is a Renaissance painting featuring a central figure draped in blue and red garments, holding a child on their lap. To the left and right of the central figure are characters dressed in period clothing, and in the background, there are serene landscape scenes depicting rolling hills, trees, structures, and a clear sky. The middle panel showcases a patterned background, while the outer panels display outdoor settings that contribute to the painting's tranquil and bucolic atmosphere.
Anthropic Claude
Created on 2024-03-29
The image depicts a religious scene featuring the Virgin Mary holding the infant Jesus, surrounded by several other figures. In the center, the Virgin Mary is seated, wearing a red robe and white veil, holding the baby Jesus on her lap. She is flanked by two male figures, one elderly and one younger, who appear to be Saint Joseph and another male saint. In the foreground, there are several other figures, including a young man with long hair and a halo, possibly representing one of the apostles, and two women, one with a white headdress and one with long blonde hair. The background features a detailed, ornate tapestry or textile hanging, as well as a pastoral landscape with buildings in the distance. The overall composition and style suggest this is a Renaissance-era religious painting.