Human Generated Data

Title

Untitled (kids getting gifts from Santa Claus)

Date

December 1956

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17995

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.6
Human 99.6
Apparel 99.1
Clothing 99.1
Person 98.4
Person 98.4
Person 96.4
Person 96.1
Hat 96
Person 95.2
Accessories 89.4
Sunglasses 89.4
Accessory 89.4
Person 88.8
Meal 79.6
Food 79.6
Overcoat 77.9
Suit 77.9
Coat 77.9
Person 76.2
People 75.5
Plant 73.5
Person 72.1
Face 71.3
Gown 66.8
Fashion 66.8
Female 66.8
Robe 65.9
Photography 64.4
Photo 64.4
Blossom 63.1
Flower 63.1
Wedding 63
Flower Arrangement 62.6
Flower Bouquet 62.6
Crowd 59.9
Person 58.4
Person 55.6
Wedding Gown 55.3
Person 46.4

Imagga
created on 2022-03-04

man 29.5
people 25.6
person 22.9
photographer 20
male 17.7
musical instrument 17
spectator 16.1
men 15.4
bowed stringed instrument 15.2
adult 14.9
business 14.6
violin 12.2
stringed instrument 12.1
mask 11.9
businessman 11.5
professional 11.4
industrial 10.9
worker 10.7
working 10.6
human 10.5
protection 10
city 10
dirty 9.9
steel drum 9.7
group 9.7
women 9.5
work 9.4
industry 9.4
building 9.2
danger 9.1
team 9
job 8.8
percussion instrument 8.8
factory 8.7
gas 8.7
room 8.6
portrait 8.4
smoke 8.4
teamwork 8.3
office 8.2
equipment 8.1
uniform 8
clothing 7.9
device 7.7
holding 7.4
lifestyle 7.2
black 7.2
world 7.2
life 7.1
gun 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

person 99
clothing 94.3
text 94
black and white 88.2
man 87.8
street 66.9
people 66.5
tree 57
funeral 55.8
crowd 0.7

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 86.7%
Sad 51%
Calm 47.1%
Confused 0.7%
Angry 0.6%
Surprised 0.3%
Happy 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 29-39
Gender Female, 89.1%
Calm 86.7%
Happy 4.4%
Sad 2.7%
Surprised 1.8%
Fear 1.6%
Disgusted 1.6%
Angry 0.6%
Confused 0.5%

AWS Rekognition

Age 21-29
Gender Female, 90.5%
Calm 74.1%
Sad 21.3%
Confused 2.5%
Fear 0.5%
Disgusted 0.5%
Happy 0.5%
Angry 0.3%
Surprised 0.3%

AWS Rekognition

Age 34-42
Gender Male, 96.6%
Calm 86.1%
Sad 7.3%
Happy 3.3%
Confused 2.1%
Disgusted 0.5%
Angry 0.4%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Female, 85%
Calm 98.2%
Sad 1.1%
Happy 0.2%
Disgusted 0.2%
Fear 0.1%
Surprised 0.1%
Angry 0.1%
Confused 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Hat 96%
Sunglasses 89.4%

Captions

Microsoft

a group of people standing in front of a crowd 87.9%
a group of people standing in front of a crowd of people 86.7%
a group of people in front of a crowd 86.6%

Text analysis

Google

MJI7--YT3RA°2--AGO