Human Generated Data

Title

Untitled (two men holding women on their shoulder, American flag, photographs)

Date

c.1910

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22061

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 98.4
Human 98.4
Clothing 95.5
Apparel 95.5
Person 95.4
Person 95
Person 91.8
Home Decor 82.2
Shorts 74.4
Advertisement 65.1
Undershirt 64.6
Floor 63.9
Poster 61.3
Pants 60.2
Shoe 59.9
Footwear 59.9
Wall 57.8
Flooring 57.7
Collage 57.7
Back 56.3
Indoors 55.8
Shoe 53.6
Person 43.9

Clarifai
created on 2023-10-22

people 99.7
woman 95.7
child 95.5
adult 94.9
wear 93.9
man 92.5
two 92.4
portrait 92.1
one 90.8
art 89.8
room 85.9
actor 85.6
family 84.1
indoors 83.7
boy 82.6
group 82.1
girl 81
street 80.5
actress 79.6
son 78.1

Imagga
created on 2022-03-11

cell 21.4
wall 20.7
old 18.8
architecture 18
house 16.7
building 15.9
door 15.8
device 15.6
people 15.1
person 14.8
call 14.8
ancient 14.7
city 14.1
hair 13.5
window 12.7
history 12.5
fashion 12.1
room 12
man 11.4
sexy 11.2
adult 11.1
historic 11
portrait 11
black 10.9
body 10.4
home 10.4
dress 9.9
travel 9.9
human 9.7
locker 9.6
refrigerator 9.5
historical 9.4
skin 9.3
face 9.2
attractive 9.1
sensuality 9.1
toilet 8.9
interior 8.8
love 8.7
white goods 8.6
cute 8.6
bathroom 8.6
model 8.6
art 8.5
vintage 8.3
tourism 8.3
lady 8.1
brown 8.1
fastener 8
lifestyle 8
urban 7.9
happiness 7.8
pretty 7.7
sculpture 7.6
stone 7.6
one 7.5
furniture 7.4
light 7.4
male 7.1
women 7.1
posing 7.1
home appliance 7

Microsoft
created on 2022-03-11

wall 97.5
clothing 96.9
person 93.5
text 87.6
black and white 84.2
footwear 83.2
drawing 81.8
man 65.7
sketch 64.9
human face 54.4
posing 37.1
clothes 25.3

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.8%
Calm 76.9%
Surprised 12.4%
Disgusted 3.3%
Happy 2.1%
Fear 1.5%
Angry 1.4%
Sad 1.3%
Confused 1.2%

AWS Rekognition

Age 24-34
Gender Male, 100%
Sad 90.7%
Confused 2.6%
Angry 2%
Disgusted 2%
Happy 1.7%
Fear 0.4%
Surprised 0.3%
Calm 0.2%

AWS Rekognition

Age 43-51
Gender Male, 100%
Happy 45%
Surprised 20.4%
Fear 12.3%
Calm 9.5%
Confused 5%
Sad 3.3%
Disgusted 3.1%
Angry 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 98.4%
Person 95.4%
Person 95%
Person 91.8%
Person 43.9%
Shoe 59.9%
Shoe 53.6%
