Human Generated Data

Title

Untitled (group of men with dog, posed outdoors, mountain scenery)

Date

c.1910

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22065

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Human 99.7
Person 99.7
Person 99.7
Person 99.5
Apparel 99.4
Clothing 99.4
Person 96.8
Person 92.6
People 91.6
Person 81.1
Couch 79.5
Furniture 79.5
Shorts 68.2
Hat 67
Face 64.3
Portrait 62.5
Photography 62.5
Photo 62.5
Coat 59.7
Suit 59.7
Overcoat 59.7
Road 58.9
Family 58.7
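
The label confidences above are the kind of output Amazon Rekognition's DetectLabels API returns. A minimal sketch of how such tags can be produced, assuming local image bytes and AWS credentials already configured in the environment (the file name and thresholds are illustrative, not taken from the record):

```python
import boto3

# Assumes AWS credentials are configured; file name and thresholds are illustrative.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out near 58-59
)

for label in response["Labels"]:
    # Each label carries a name and a confidence percentage,
    # e.g. "Person 99.7" as in the tag list above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```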

Imagga
created on 2022-03-11

person 26.3
hut 25.1
man 24.2
people 21.2
shelter 20.5
adult 20.1
silhouette 19.9
male 17.7
athlete 16.6
structure 15.3
sunset 15.3
dark 15
world 14.8
outdoor 13
ballplayer 12.8
water 12.7
sport 12.3
performer 12.3
dancer 12.2
dirty 11.7
player 11.7
light 11.5
portrait 11
beach 11
lifestyle 10.8
travel 10.6
attractive 10.5
fun 10.5
sexy 10.4
ocean 10
fashion 9.8
one 9.7
sky 9.6
kin 9.5
outdoors 9.5
grunge 9.4
child 9.3
action 9.3
exercise 9.1
pose 9.1
fitness 9
landscape 8.9
cool 8.9
body 8.8
contestant 8.8
model 8.6
canvas tent 8.5
old 8.4
leisure 8.3
human 8.2
style 8.2
posing 8
night 8
couple 7.8
darkness 7.8
soldier 7.8
rock 7.8
summer 7.7
motion 7.7
enjoy 7.5
happy 7.5
evening 7.5
active 7.4
fan 7.4
holding 7.4
vacation 7.4
protection 7.3
danger 7.3
industrial 7.3
shadow 7.2
wet 7.2
love 7.1
dance 7
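
The Imagga tags come from its REST tagging endpoint. A rough sketch of such a request, assuming an Imagga API key/secret pair and a publicly reachable image URL (all three are placeholders):

```python
import requests

# Placeholder credentials and URL; Imagga uses HTTP Basic auth
# with an API key/secret pair.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
image_url = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # Each entry pairs a confidence score with a tag keyed by language,
    # e.g. "person 26.3" as in the list above.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```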

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

outdoor 99.1
clothing 94.5
person 92.2
text 89.7
player 89.3
man 85.8
old 62.7
footwear 52.9
posing 50.1
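
The Microsoft tags resemble what Azure Computer Vision's image tagging returns. A minimal sketch using the Python SDK, with the endpoint, key, and image URL as placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key from an Azure Cognitive Services resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.org/photo.jpg")
for tag in result.tags:
    # Confidence is reported as a 0-1 fraction; the list above shows percentages.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```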

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 87.4%
Calm 74.1%
Happy 9%
Sad 7.8%
Fear 6.6%
Surprised 1.2%
Disgusted 0.7%
Angry 0.4%
Confused 0.2%

AWS Rekognition

Age 36-44
Gender Male, 78.8%
Calm 82.6%
Happy 10.4%
Sad 3%
Disgusted 1.8%
Angry 0.7%
Surprised 0.7%
Confused 0.5%
Fear 0.3%

AWS Rekognition

Age 45-51
Gender Female, 95.9%
Calm 54.2%
Happy 44.8%
Surprised 0.4%
Confused 0.2%
Sad 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 27-37
Gender Male, 87.7%
Happy 91.2%
Sad 3%
Disgusted 1.6%
Surprised 1.3%
Angry 1%
Calm 0.8%
Confused 0.6%
Fear 0.5%

AWS Rekognition

Age 34-42
Gender Male, 96.5%
Happy 50.7%
Surprised 18.2%
Sad 17.1%
Calm 5%
Fear 2.8%
Disgusted 2.6%
Confused 2.2%
Angry 1.5%
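
Each AWS Rekognition block above corresponds to one detected face from the DetectFaces API with all attributes requested; age range, gender, and per-emotion confidences come back in each FaceDetail. A sketch of reading those fields, again assuming local image bytes and configured AWS credentials (file name is illustrative):

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotions to each FaceDetail.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are scored independently; sorting mirrors the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```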

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
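
The Google Vision entries are per-face likelihood ratings from its face detection feature, which reports buckets (Very unlikely through Very likely) rather than numeric scores. A sketch with a recent google-cloud-vision client library, assuming the image is read from a local file (path is illustrative):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods come back as enum buckets (VERY_UNLIKELY .. VERY_LIKELY),
# matching the ratings above.
likelihood = vision.Likelihood
for face in response.face_annotations:
    print("Surprise", likelihood(face.surprise_likelihood).name)
    print("Anger", likelihood(face.anger_likelihood).name)
    print("Sorrow", likelihood(face.sorrow_likelihood).name)
    print("Joy", likelihood(face.joy_likelihood).name)
    print("Headwear", likelihood(face.headwear_likelihood).name)
    print("Blurred", likelihood(face.blurred_likelihood).name)
```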

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of baseball players standing on top of a field 81.2%
a group of baseball players standing on top of a grass covered field 75.3%
a group of baseball players posing for a photo 75.2%
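
The ranked captions come from Azure Computer Vision's image description feature, which returns several candidate captions with confidence scores. A minimal sketch, with the endpoint, key, and image URL as placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key from an Azure Cognitive Services resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

# max_candidates requests several ranked captions, as in the list above.
description = client.describe_image(
    "https://example.org/photo.jpg", max_candidates=3
)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```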