Human Generated Data

Title

Untitled (bride, groom, and wedding party in front of house)

Date

1950

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19494

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Human 99.5
Person 99.5
Person 99
Apparel 98.9
Clothing 98.9
Person 98.6
Person 96.7
Person 96.3
Person 92.7
Nature 92.2
Outdoors 91.7
Dress 89.9
Tree 89.9
Plant 89.9
Face 89.4
Female 86.9
Vegetation 76.4
Chair 76.3
Furniture 76.3
Person 76
People 73.9
Woman 67.2
Person 66.3
Fashion 65.6
Robe 65.6
Child 64.4
Kid 64.4
Gown 64.2
Girl 63.6
Photo 62.3
Photography 62.3
Grass 59
Overcoat 57.2
Coat 57.2
Ice 56.3
Person 43.5

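The confidence-scored tags above follow the shape of an AWS Rekognition label response: a list of records, each with a name and a confidence percentage. A minimal sketch of filtering such records by a confidence cutoff (the sample data is a hypothetical subset of the tags listed above, not a full API response):

```python
# Sketch: filter Rekognition-style label records by a confidence
# threshold, similar to the cut a MinConfidence parameter applies
# server-side. Sample data is a subset of the tags listed above.
labels = [
    {"Name": "Person", "Confidence": 99.5},
    {"Name": "Dress", "Confidence": 89.9},
    {"Name": "Ice", "Confidence": 56.3},
    {"Name": "Person", "Confidence": 43.5},
]

def confident_labels(records, min_confidence=90.0):
    """Return label names at or above the threshold, strongest first."""
    kept = [r for r in records if r["Confidence"] >= min_confidence]
    return [r["Name"] for r in sorted(kept, key=lambda r: -r["Confidence"])]

print(confident_labels(labels))  # ['Person']
```

Lowering the threshold admits weaker guesses: at 50% the same sample also yields "Dress" and "Ice", which illustrates why low-confidence machine tags (like "Ice 56.3" on a wedding photograph) warrant human review.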
Clarifai
created on 2019-10-29

people 100
group 99.8
group together 99.5
adult 98.9
many 98.5
woman 98.3
several 97.7
wear 95.7
man 95.2
child 95.2
outfit 94.6
five 93.3
administration 90.7
four 90.4
recreation 89.8
offspring 89.5
facial expression 89.3
actor 88.8
actress 88.7
three 87.4

Imagga
created on 2019-10-29

barbershop 49.1
shop 39.1
mercantile establishment 30.3
man 21.5
place of business 20.2
city 19.1
people 17.3
business 15.8
person 15.6
window 15.1
building 14.4
musical instrument 14.3
male 14.2
urban 14
men 12.9
world 12.7
street 12
architecture 10.9
adult 10.8
businessman 10.6
couple 10.4
establishment 10.1
light 10
silhouette 9.9
travel 9.9
transportation 9
women 8.7
youth 8.5
future 8.4
old 8.4
office 8.3
door 8.3
transport 8.2
alone 8.2
working 8
together 7.9
black 7.8
accordion 7.7
walk 7.6
house 7.5
happy 7.5
dark 7.5
worker 7.2
wind instrument 7.2
family 7.1

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

clothing 96.9
person 95.3
outdoor 92.3
wedding dress 91.3
woman 87.6
dress 86.3
bride 85.1
text 83.2
man 76.3
black and white 74.9
footwear 71.8

Face analysis

Amazon

AWS Rekognition

Age 24-38
Gender Male, 53.7%
Angry 45.3%
Calm 45.5%
Disgusted 45%
Fear 45.4%
Happy 52.2%
Confused 45.5%
Surprised 45.9%
Sad 45.2%

AWS Rekognition

Age 36-52
Gender Male, 54.6%
Disgusted 45.1%
Fear 46.5%
Happy 47%
Calm 47.4%
Angry 45.4%
Sad 48.4%
Confused 45.1%
Surprised 45.2%

AWS Rekognition

Age 36-52
Gender Male, 51.9%
Disgusted 45%
Calm 45.3%
Surprised 45.1%
Confused 45.3%
Fear 46.2%
Happy 45.1%
Angry 46.1%
Sad 51.9%

AWS Rekognition

Age 44-62
Gender Male, 52.9%
Calm 51.4%
Disgusted 45.1%
Sad 45.2%
Angry 45.2%
Confused 45.6%
Fear 45.4%
Happy 45.2%
Surprised 47%

AWS Rekognition

Age 32-48
Gender Female, 51%
Happy 46.2%
Fear 45%
Sad 45%
Calm 53.7%
Confused 45%
Disgusted 45%
Angry 45%
Surprised 45%

AWS Rekognition

Age 30-46
Gender Male, 54.1%
Confused 46.6%
Fear 50%
Calm 45.6%
Angry 45.2%
Surprised 46.2%
Sad 45.9%
Happy 45.3%
Disgusted 45.2%

AWS Rekognition

Age 43-61
Gender Male, 54.6%
Angry 45%
Happy 45%
Disgusted 45%
Confused 45.2%
Fear 45.1%
Surprised 45%
Sad 54%
Calm 45.6%

AWS Rekognition

Age 37-55
Gender Female, 51%
Angry 46.5%
Calm 45.3%
Fear 45.7%
Confused 45.5%
Disgusted 45.3%
Happy 45.1%
Sad 51.5%
Surprised 45.1%

AWS Rekognition

Age 13-23
Gender Male, 53.8%
Sad 46.9%
Fear 49.4%
Angry 45.9%
Disgusted 45.5%
Surprised 45.4%
Calm 46.3%
Confused 45.4%
Happy 45.4%
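Each block above mirrors a Rekognition face record: an estimated age range, a gender guess with its own confidence, and per-emotion confidences that need not sum to 100. A minimal sketch of extracting the dominant emotion from one such record (values copied from the first face listed; the dict layout is an assumption modeled on Rekognition's FaceDetail structure):

```python
# Sketch: pick the highest-confidence emotion from a
# Rekognition-style face record. Emotion values are copied
# from the first AWS Rekognition face listed above.
face = {
    "AgeRange": {"Low": 24, "High": 38},
    "Gender": {"Value": "Male", "Confidence": 53.7},
    "Emotions": [
        {"Type": "ANGRY", "Confidence": 45.3},
        {"Type": "CALM", "Confidence": 45.5},
        {"Type": "HAPPY", "Confidence": 52.2},
        {"Type": "SAD", "Confidence": 45.2},
    ],
}

def dominant_emotion(record):
    """Return (type, confidence) for the strongest emotion guess."""
    best = max(record["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

print(dominant_emotion(face))  # ('HAPPY', 52.2)
```

Note how narrow the margins are: most emotions cluster near 45%, so the "dominant" label is often only a few points ahead of the rest.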

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people standing in front of a building 70.9%
a group of people sitting in front of a building 63.8%
a person standing in front of a building 63.7%

Text analysis

Amazon

el
061..9