Human Generated Data

Title

Untitled (five women in hats gathered around railing outdoors)

Date

1934

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13012

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.7
Human 99.7
Clothing 99.5
Apparel 99.5
Person 99.3
Person 98.9
Person 97
Person 96.7
People 84.6
Robe 84.3
Fashion 84.3
Gown 81.3
Wedding 70.2
Female 68.8
Military 66.6
Military Uniform 66.6
Wedding Gown 65.3
Art 62.1
Soldier 58.1
Food 56.8
Cake 56.8
Icing 56.8
Dessert 56.8
Cream 56.8
Creme 56.8
Photography 55.6
Photo 55.6
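
The name/score pairs above are the form of output returned by Amazon Rekognition's DetectLabels operation (scores are confidence percentages). A minimal sketch with boto3 is shown below; the image filename and the MaxLabels/MinConfidence settings are illustrative assumptions, since the parameters of the 2022 tagging run are not recorded here.

import boto3

# Hypothetical local path to the digitized photograph; not part of the museum record.
IMAGE_PATH = "untitled_five_women_in_hats.jpg"

rekognition = boto3.client("rekognition")  # assumes AWS credentials and region are configured

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # assumed cap on returned labels
        MinConfidence=50.0,  # assumed threshold, roughly matching the lowest scores listed above
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')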

Imagga
created on 2022-02-05

negative 30.1
film 25.5
people 21.2
man 18.8
photographic paper 18.2
person 17.9
black 13.8
art 13.8
male 13.5
old 13.2
newspaper 12.9
sexy 12.8
adult 12.5
photographic equipment 12.1
portrait 11.6
statue 11.6
sculpture 11.6
body 11.2
love 11
model 10.9
vintage 10.7
room 10.6
human 10.5
lifestyle 10.1
fountain 10.1
sensuality 10
history 9.8
product 9.1
groom 9
barbershop 8.7
antique 8.6
ancient 8.6
bride 8.6
attractive 8.4
relaxation 8.4
dark 8.3
fashion 8.3
style 8.2
happy 8.1
dress 8.1
symbol 8.1
religion 8.1
light 8
spectator 8
couple 7.8
architecture 7.8
marble 7.7
men 7.7
pretty 7.7
grunge 7.7
two 7.6
structure 7.6
wedding 7.4
girls 7.3
creation 7.2
wet 7.2
hair 7.1
romantic 7.1
women 7.1
posing 7.1
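
The Imagga tags follow the same word/score pattern and could be reproduced through Imagga's image-tagging endpoint. This is a rough sketch against the v2 REST API; the credentials and image URL are placeholders, and the response shape may differ from the API version used for this record.

import requests

# Assumed placeholders; none of these values come from the museum record.
API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/untitled_five_women_in_hats.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')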

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 89.5
outdoor 87.3
clothing 82.5
person 80.3
black and white 78.8
old 50.2

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 90.8%
Confused 34.6%
Calm 30.7%
Sad 17.9%
Happy 6.2%
Surprised 4.4%
Disgusted 3.1%
Fear 2.2%
Angry 0.9%

AWS Rekognition

Age 19-27
Gender Female, 74.8%
Happy 76.2%
Calm 17.6%
Disgusted 2.4%
Surprised 1.8%
Confused 0.8%
Fear 0.5%
Sad 0.5%
Angry 0.3%

AWS Rekognition

Age 43-51
Gender Female, 90.2%
Calm 74.6%
Happy 14.2%
Sad 3.8%
Surprised 2.2%
Confused 2.1%
Disgusted 1.4%
Angry 1%
Fear 0.7%
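
The three blocks above (age range, gender, and ranked emotions per detected face) match the shape of Amazon Rekognition's DetectFaces response when all facial attributes are requested. A minimal boto3 sketch, again assuming a local copy of the photograph:

import boto3

rekognition = boto3.client("rekognition")

with open("untitled_five_women_in_hats.jpg", "rb") as f:  # assumed filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')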

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
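
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. The sketch below uses the google-cloud-vision client's face detection call; the filename is an assumption, and credentials are taken from the environment.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("untitled_five_women_in_hats.jpg", "rb") as f:  # assumed filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values map to these names in order.
likelihood_name = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihood_name[face.surprise_likelihood])
    print("Anger", likelihood_name[face.anger_likelihood])
    print("Sorrow", likelihood_name[face.sorrow_likelihood])
    print("Joy", likelihood_name[face.joy_likelihood])
    print("Headwear", likelihood_name[face.headwear_likelihood])
    print("Blurred", likelihood_name[face.blurred_likelihood])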

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people standing in front of a building 79.5%
a group of people standing in front of a store 66.9%
an old photo of a person 66.8%
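
The three candidate captions with confidences are the kind of output produced by Azure Computer Vision's Describe Image operation. A rough sketch against the v3.2 REST endpoint follows; the endpoint, key, and image URL are placeholders, and the API returns confidences as fractions of 1, so they are scaled to percentages to match the figures above.

import requests

# Assumed placeholders; none of these values come from the museum record.
ENDPOINT = "https://your-resource.cognitiveservices.azure.com"
KEY = "your_azure_computer_vision_key"
IMAGE_URL = "https://example.org/untitled_five_women_in_hats.jpg"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')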

Text analysis

Amazon

use
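
The single detected string "use" is consistent with Amazon Rekognition's DetectText operation, which returns line- and word-level detections with confidences. A minimal boto3 sketch, assuming the same local image file as above:

import boto3

rekognition = boto3.client("rekognition")

with open("untitled_five_women_in_hats.jpg", "rb") as f:  # assumed filename
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates of each line
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')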