Human Generated Data

Title

Untitled (girls basketball team)

Date

c. 1925

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1902

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.5
Human 99.5
Person 99.1
Person 98.9
Person 98.7
People 97.4
Person 97.4
Person 96.1
Clothing 95.9
Apparel 95.9
Person 95.4
Person 94.7
Person 94
Family 88.8
Person 80.3
Face 72.2
Tree 71.9
Plant 71.9
Suit 68.8
Coat 68.8
Overcoat 68.8
Robe 62.9
Fashion 62.9
Vegetation 62.6
Photography 62.3
Photo 62.3
Female 62.2
Wedding 59.6
Gown 55.8

Clarifai
created on 2023-10-25

people 100
group 99.3
group together 99.1
man 98.3
adult 98.3
child 98.2
woman 95.5
many 93.2
several 90.7
leader 89.8
monochrome 88.5
outfit 87.4
recreation 87.1
three 87
wear 85.9
boy 84.7
family 80.8
print 80
beach 79.6
four 78.5

Imagga
created on 2021-12-14

graffito 38.8
decoration 27.7
snow 25.4
man 19.5
people 15
wheeled vehicle 14
black 13.8
sport 13.6
winter 13.6
portrait 13.6
outdoors 13.4
weather 13.2
swing 12.8
person 12.6
male 12.1
model 11.7
mechanical device 11.6
tricycle 11.4
adult 11
dress 10.8
vehicle 10.8
park 10.7
outdoor 10.7
sexy 10.4
cold 10.3
fashion 9.8
plaything 9.7
drawing 9.4
happy 9.4
grunge 9.4
fun 9
body 8.8
forest 8.7
water 8.7
play 8.6
happiness 8.6
mechanism 8.5
tree 8.5
art 8.5
power 8.4
old 8.4
dark 8.3
style 8.2
sketch 8
lifestyle 7.9
ice 7.9
hair 7.9
couple 7.8
face 7.8
fear 7.7
run 7.7
culture 7.7
fence 7.6
traditional 7.5
silhouette 7.4
structure 7.4
women 7.1
cool 7.1
summer 7.1
day 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

person 93.2
text 92.4
clothing 87.4
tree 66.8
old 52.8
posing 44.2
picture frame 8.9

Color Analysis

Face analysis


AWS Rekognition

Age 34-50
Gender Female, 95%
Sad 80%
Calm 15.6%
Confused 2.4%
Happy 0.8%
Fear 0.5%
Angry 0.4%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 21-33
Gender Male, 55.7%
Angry 39.6%
Fear 31.3%
Surprised 9.1%
Happy 9%
Calm 5%
Disgusted 2.2%
Sad 1.9%
Confused 1.9%

AWS Rekognition

Age 24-38
Gender Male, 88.5%
Calm 85.4%
Sad 11.5%
Happy 2.1%
Confused 0.4%
Angry 0.3%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 44-62
Gender Male, 87.8%
Happy 27.6%
Surprised 22.7%
Angry 22%
Calm 17.5%
Fear 6.1%
Sad 1.8%
Confused 1.3%
Disgusted 0.9%

AWS Rekognition

Age 23-37
Gender Female, 61.2%
Calm 79.5%
Sad 15%
Happy 2.2%
Fear 1.3%
Confused 0.6%
Angry 0.6%
Surprised 0.6%
Disgusted 0.2%

AWS Rekognition

Age 46-64
Gender Female, 50.3%
Calm 77.4%
Sad 15.9%
Happy 4.4%
Confused 1.4%
Angry 0.3%
Disgusted 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 44-62
Gender Female, 81.6%
Calm 41.8%
Happy 34.5%
Sad 16%
Angry 2.5%
Surprised 1.8%
Confused 1.7%
Fear 1.5%
Disgusted 0.3%

AWS Rekognition

Age 22-34
Gender Female, 83.2%
Calm 77.5%
Sad 11%
Happy 10%
Angry 0.6%
Confused 0.4%
Fear 0.3%
Surprised 0.2%
Disgusted 0.2%

AWS Rekognition

Age 24-38
Gender Female, 93.3%
Sad 60.3%
Calm 13.7%
Fear 13.5%
Happy 8.7%
Confused 1.7%
Surprised 1.2%
Disgusted 0.6%
Angry 0.4%

AWS Rekognition

Age 47-65
Gender Female, 60.2%
Calm 50.7%
Sad 16.9%
Confused 10.1%
Surprised 9.6%
Happy 8.1%
Angry 2%
Fear 1.3%
Disgusted 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

paintings art 99.9%