Human Generated Data

Title

Untitled (children and mothers at yard party)

Date

c. 1950

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.467

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.5
Person 99.5
Person 98.8
Person 98.4
Clothing 97.2
Apparel 97.2
Person 97.1
Car 97.1
Automobile 97.1
Transportation 97.1
Vehicle 97.1
Person 95.5
Person 94.3
Person 94.3
Person 92
Shorts 91.6
Grass 91.4
Plant 91.4
People 91.3
Meal 90.4
Food 90.4
Person 86.7
Person 86
Person 82.5
Linen 81.8
Home Decor 81.8
Dress 81.5
Person 81.2
Shoe 78.3
Footwear 78.3
Crowd 75.8
Shoe 74.1
Picnic 73.6
Leisure Activities 73.6
Vacation 73.6
Outdoors 73.5
Female 72
Tree 71
Shoe 70.5
Face 65
Photography 64.5
Photo 64.5
Gown 59.3
Fashion 59.3
Shoe 58.7
Robe 58.1
Wedding 57.3
Bridegroom 55.1

Clarifai
created on 2023-10-26

people 100
child 99.7
group together 99.4
group 99
many 98.7
wear 98
adult 97.2
recreation 95.7
uniform 95.5
woman 94.6
boy 94.3
man 93.8
several 92.4
outfit 91
administration 89.4
leader 87.1
spectator 85.4
monochrome 84.9
wedding 82.9
war 80.8

Imagga
created on 2022-01-23

military uniform 100
uniform 100
clothing 80
consumer goods 55.7
covering 55.1
commodity 27.8
people 24
man 22.2
male 16.3
person 14.8
military 14.5
travel 13.4
men 12.9
adult 12.5
history 12.5
war 12.5
family 12.5
outdoor 12.2
old 11.8
tourism 11.6
group 11.3
outdoors 11.2
horse 10.4
portrait 10.4
tourist 10
soldier 9.8
army 9.8
couple 9.6
women 9.5
walking 9.5
happiness 9.4
industrial 9.1
park 9.1
happy 8.8
mask 8.6
culture 8.5
historic 8.3
danger 8.2
nurse 8.2
rifle 8.1
transportation 8.1
gun 8
together 7.9
animal 7.9
summer 7.7
bride 7.7
child 7.6
statue 7.6
doctor 7.5
mother 7.4
tradition 7.4
spectator 7.3
new 7.3
suit 7.2
smile 7.1
mountain 7.1
grass 7.1
work 7.1

Google
created on 2022-01-23

Dress 89.3
Gesture 85.2
Motor vehicle 82.4
Table 76.3
Vintage clothing 76.3
Classic 74.6
Tree 74.5
Event 72.8
Monochrome 69.6
Vehicle 69.5
Car 65.9
Monochrome photography 64
Fun 63.5
Photo caption 62.9
Room 62.2
Stock photography 61.8
Plant 61
Chair 57.6
History 57.4
Sharing 57

Microsoft
created on 2022-01-23

outdoor 99.1
tree 98.9
person 97.4
grass 95.7
clothing 92.9
dress 91.6
woman 81.9
black 79.8
standing 79.4
footwear 76.2
wedding dress 74.8
group 71.3
white 69.1
sport 68.5
people 67.2
bride 60.1
old 58
posing 50.5
vintage 28.2
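The tag lists above contain many repeated labels (for example, "Person" appears over a dozen times in the Amazon list, once per detected instance, each with its own confidence). A minimal sketch of how such per-instance tags can be collapsed into one maximum confidence per label — the function name and sample data here are illustrative, not part of any vendor's API:

```python
from collections import defaultdict

def max_confidence_per_label(tags):
    """Collapse repeated (label, confidence) pairs -- e.g. the many
    per-instance 'Person' detections -- into one max confidence per label."""
    best = defaultdict(float)
    for label, conf in tags:
        best[label] = max(best[label], conf)
    return dict(best)

# Illustrative sample drawn from the Amazon tag list above
sample = [("Person", 99.6), ("Person", 99.5), ("Car", 97.1),
          ("Shoe", 78.3), ("Shoe", 74.1)]
print(max_confidence_per_label(sample))
# {'Person': 99.6, 'Car': 97.1, 'Shoe': 78.3}
```

This is one common way to summarize instance-level detections; the per-instance confidences themselves are preserved in the Feature analysis section below.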

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 64.4%
Angry 50.8%
Sad 19.8%
Calm 16.2%
Confused 8.8%
Surprised 1.4%
Happy 1.2%
Disgusted 0.9%
Fear 0.9%

AWS Rekognition

Age 30-40
Gender Female, 75%
Confused 82.8%
Calm 11.2%
Sad 2%
Surprised 1.3%
Happy 0.8%
Angry 0.7%
Fear 0.7%
Disgusted 0.3%

AWS Rekognition

Age 7-17
Gender Female, 90.6%
Calm 46.3%
Disgusted 16.9%
Angry 15.2%
Happy 6.4%
Fear 4.8%
Sad 3.5%
Confused 3.5%
Surprised 3.4%

AWS Rekognition

Age 25-35
Gender Male, 93.2%
Disgusted 55.3%
Happy 21.7%
Sad 16.7%
Surprised 1.9%
Calm 1.6%
Fear 1.3%
Angry 1%
Confused 0.5%

AWS Rekognition

Age 26-36
Gender Female, 99.7%
Sad 86.5%
Angry 9.4%
Disgusted 2.2%
Surprised 0.8%
Fear 0.3%
Happy 0.3%
Calm 0.3%
Confused 0.2%

AWS Rekognition

Age 6-12
Gender Male, 97.6%
Calm 65.8%
Sad 14.4%
Angry 9.7%
Happy 2.2%
Fear 2.2%
Disgusted 2.2%
Confused 2.2%
Surprised 1.2%

AWS Rekognition

Age 13-21
Gender Female, 72.8%
Calm 48.2%
Sad 34.2%
Angry 7.7%
Disgusted 3.6%
Fear 2.3%
Confused 2%
Surprised 1.2%
Happy 0.8%

AWS Rekognition

Age 9-17
Gender Male, 69.4%
Sad 60.1%
Calm 17.2%
Fear 6.7%
Angry 4.8%
Confused 4.4%
Disgusted 3.2%
Happy 1.9%
Surprised 1.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Car
Shoe
Person 99.6%
Person 99.5%
Person 99.5%
Person 98.8%
Person 98.4%
Person 97.1%
Person 95.5%
Person 94.3%
Person 94.3%
Person 92%
Person 86.7%
Person 86%
Person 82.5%
Person 81.2%
Car 97.1%
Shoe 78.3%
Shoe 74.1%
Shoe 70.5%
Shoe 58.7%
