Human Generated Data

Title

Untitled (Christmas play cast in costume in classroom)

Date

December 15, 1953

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18011

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Clothing 99.5
Apparel 99.5
Human 98
Person 98
Person 97.8
Person 96.6
Person 96.4
Person 96.1
Person 95.9
Person 95.7
Person 94.4
Person 90.6
Person 87.8
Female 77.2
Costume 76.2
Dress 74.9
People 73.1
Face 72.8
Cloak 72.4
Fashion 72.4
Floor 70.9
Furniture 70.9
Chair 70.9
Indoors 70.1
Evening Dress 65.1
Robe 65.1
Gown 65.1
Person 64.8
Helmet 63.6
Person 63
Room 61.5
Girl 59.9
Flooring 59.1
Woman 57.5

Imagga
created on 2022-03-04

gown 52.3
vestment 51.7
clothing 49.5
outerwear 39
business 38.9
people 31.2
businessman 30
man 29.6
robe 28.6
garment 24.9
suit 23.3
corporate 23.2
metropolitan 22.5
covering 21.8
consumer goods 21.1
group 20.9
male 20.6
office 19.5
team 18.8
professional 18.7
executive 17.5
person 17.4
men 17.2
adult 16.7
success 16.1
work 14.9
modern 14.7
black 14.6
academic gown 14.3
job 14.1
meeting 14.1
clothes 14
teamwork 13.9
women 13.4
businesswoman 12.7
building 12.7
architecture 12.5
worker 12.4
silhouette 12.4
businesspeople 12.3
jacket 12
window 11.9
corporation 10.6
fashion 10.5
indoors 10.5
career 10.4
manager 10.2
indoor 10
dress 9.9
interior 9.7
urban 9.6
crowd 9.6
tie 9.5
happy 9.4
outfit 9.3
successful 9.1
attractive 9.1
handsome 8.9
lifestyle 8.7
employee 8.6
adults 8.5
casual 8.5
laptop 8.3
city 8.3
holding 8.2
confident 8.2
day 7.8
standing 7.8
full length 7.8
entrepreneur 7.7
travel 7.7
employment 7.7
wall 7.7
boss 7.6
finance 7.6
briefcase 7.6
shirt 7.5
style 7.4
20s 7.3
smiling 7.2
market 7.1
to 7.1
happiness 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

person 99.2
text 96.8
clothing 94.6
black and white 87.6
black 83.6
standing 82.9
white 74.7
posing 66.7
group 65.1
funeral 64.1
man 52.6
old 46

Face analysis

Amazon

Google

AWS Rekognition

Age 33-41
Gender Male, 96.3%
Calm 98.4%
Surprised 0.7%
Sad 0.3%
Angry 0.2%
Confused 0.2%
Happy 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 29-39
Gender Male, 97.4%
Surprised 37.2%
Calm 28.2%
Angry 20.7%
Happy 5.9%
Fear 2.8%
Sad 2.7%
Confused 1.5%
Disgusted 1.1%

AWS Rekognition

Age 29-39
Gender Male, 99.7%
Calm 97.6%
Sad 1.7%
Angry 0.4%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 25-35
Gender Male, 86.8%
Calm 54.7%
Fear 37.1%
Happy 4.6%
Sad 1.9%
Surprised 0.9%
Confused 0.3%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Calm 96.7%
Confused 1%
Sad 0.6%
Angry 0.5%
Surprised 0.5%
Happy 0.4%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 100%
Calm 98.7%
Disgusted 0.2%
Sad 0.2%
Surprised 0.2%
Angry 0.2%
Confused 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Female, 99.5%
Calm 94.6%
Surprised 2%
Sad 1.1%
Fear 0.7%
Disgusted 0.5%
Angry 0.5%
Confused 0.3%
Happy 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98%
Helmet 63.6%

Captions

Microsoft

a group of people posing for a photo 94.9%
a group of people standing in front of a crowd posing for the camera 92.5%
a group of people posing for the camera 92.4%

Text analysis

Amazon

PIECES
MASTER
EAA
KODAK-A

Google

YT37A°2
YT37A°2