Human Generated Data

Title

Untitled (woman and man sitting on chairs in crowd and hugging)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15020

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.8
Apparel 99.8
Person 98.5
Human 98.5
Person 97.3
Person 97.2
Person 97.1
Female 92.1
Person 88.1
Gown 85.9
Fashion 85.9
Robe 84.2
Face 81.4
Woman 77.7
Dress 76.5
Person 75
Collage 74.6
Advertisement 74.6
Poster 74.6
Wedding 74.5
Bridegroom 68.1
People 67.7
Wedding Gown 67.1
Art 65.7
Drawing 65.7
Furniture 65.3
Girl 64.6
Coat 62.9
Suit 62.9
Overcoat 62.9
Person 62.1
Photo 61.7
Photography 61.7
Indoors 58.5
Evening Dress 57.6
Leisure Activities 57.3
Dance Pose 56.8
Floor 55.5
Bride 55.3
Person 50.8
Person 49.5
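The Amazon list above repeats some labels ("Person" appears eight times at different confidences) because the detector reports each detected instance separately. A minimal sketch of how such a list might be cleaned up for display, using a small hypothetical subset of the tags above (the data and threshold are illustrative, not part of the Rekognition API):

```python
# Hypothetical subset of the Amazon tag list above: (label, confidence %).
labels = [
    ("Clothing", 99.8), ("Person", 98.5), ("Person", 97.3),
    ("Female", 92.1), ("Gown", 85.9), ("Bride", 55.3),
]

def filter_labels(labels, threshold):
    """Deduplicate repeated labels and keep those at or above the threshold."""
    best = {}
    for name, conf in labels:
        # Repeated names (e.g. several "Person" entries) correspond to
        # separate detected instances; keep the highest confidence per name.
        best[name] = max(best.get(name, 0.0), conf)
    return {name: conf for name, conf in best.items() if conf >= threshold}

print(filter_labels(labels, 90))
# {'Clothing': 99.8, 'Person': 98.5, 'Female': 92.1}
```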

Imagga
created on 2022-03-05

negative 59.4
film 49.5
photographic paper 36.7
photographic equipment 24.5
people 17.3
person 14.9
adult 13.1
art 13
fashion 12.8
man 12.1
hair 11.1
face 10.6
portrait 10.3
grunge 10.2
sexy 9.6
black 9.6
pattern 9.6
graphic 9.5
love 9.5
shower cap 9.4
clothing 9.4
cap 9.4
winter 9.4
modern 9.1
silhouette 9.1
attractive 9.1
technology 8.9
style 8.9
science 8.9
drawing 8.9
bride 8.8
sketch 8.7
elegant 8.6
pretty 8.4
elegance 8.4
human 8.2
dress 8.1
glass 8.1
symbol 8.1
decoration 8
design 7.9
couple 7.8
groom 7.7
magic 7.6
skin 7.6
finance 7.6
poster 7.6
vintage 7.4
retro 7.4
wedding 7.4
light 7.3
lady 7.3
music 7.2
currency 7.2
celebration 7.2
holiday 7.2
night 7.1
decor 7.1
professional 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.6
drawing 97.1
sketch 94.8
cartoon 91.4
art 84.7
black and white 71.3
poster 63.3
clothing 59.8
woman 58.4
human face 57.8
painting 57.4
person 55.5
wedding dress 53

Face analysis

Amazon

Google

AWS Rekognition

Age 50-58
Gender Male, 99.8%
Calm 56.6%
Happy 11.1%
Confused 10.5%
Surprised 8.5%
Angry 6%
Sad 3.2%
Disgusted 2.8%
Fear 1.3%

AWS Rekognition

Age 40-48
Gender Female, 59.3%
Calm 94.5%
Sad 2.1%
Confused 1.4%
Happy 0.9%
Angry 0.3%
Disgusted 0.3%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 39-47
Gender Female, 94.5%
Sad 50.1%
Calm 39.6%
Confused 5.8%
Happy 2.7%
Disgusted 0.6%
Surprised 0.4%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 21-29
Gender Female, 99.8%
Sad 42.1%
Surprised 29.9%
Calm 14%
Fear 9.8%
Disgusted 1.4%
Confused 1.4%
Angry 1%
Happy 0.4%

AWS Rekognition

Age 50-58
Gender Male, 96.9%
Happy 58.6%
Confused 14.7%
Calm 13.2%
Sad 5.4%
Disgusted 3.4%
Surprised 2.3%
Fear 1.4%
Angry 1%

AWS Rekognition

Age 45-51
Gender Female, 66.3%
Sad 37.6%
Calm 37%
Confused 11.5%
Angry 4.5%
Happy 4.4%
Disgusted 2.6%
Surprised 1.5%
Fear 0.9%
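Each AWS Rekognition face reading above lists eight emotion scores that sum to roughly 100%. A sketch of picking the dominant emotion from one such reading (the dictionary below copies the third face's scores from the list above; the helper function is illustrative, not part of the Rekognition API):

```python
# Emotion scores for one face, copied from an AWS Rekognition reading above.
emotions = {
    "Sad": 50.1, "Calm": 39.6, "Confused": 5.8, "Happy": 2.7,
    "Disgusted": 0.6, "Surprised": 0.4, "Angry": 0.4, "Fear": 0.3,
}

def dominant_emotion(emotions):
    """Return the (name, confidence) pair with the highest score."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))  # ('Sad', 50.1)
```

Note that a dominant score near 50% (as here) means the model is nearly split between two emotions, so the top label alone can be misleading.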

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
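Unlike the percentage scores above, Google Vision reports face attributes on an ordinal likelihood scale. A sketch of comparing such values, assuming the five-step ordering shown in the readings above (the display strings here mirror this record; the API itself uses enum constants such as VERY_UNLIKELY):

```python
# Assumed ordinal scale, lowest to highest, matching the values above.
LIKELIHOOD_ORDER = [
    "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]

def at_least(value, threshold):
    """True if `value` sits at or above `threshold` on the ordinal scale."""
    return LIKELIHOOD_ORDER.index(value) >= LIKELIHOOD_ORDER.index(threshold)

print(at_least("Likely", "Possible"))         # True  (e.g. Blurred above)
print(at_least("Very unlikely", "Possible"))  # False (e.g. Joy above)
```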

Feature analysis

Amazon

Person 98.5%

Captions

Microsoft

a group of people looking at a book 29%
a group of people around each other 28.9%