Human Generated Data

Title

Untitled (woman with two dogs)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16615

Machine Generated Data

Tags

Amazon
created on 2022-02-12

Clothing 99.1
Apparel 99.1
Person 97.3
Human 97.3
Dress 97.2
Female 96
Face 92
Woman 85.2
Nature 82.5
Chair 82
Furniture 82
Person 81
Outdoors 80.7
Shorts 76.1
Plant 73.5
Tree 71.5
Yard 68.7
Girl 67
Portrait 65.5
Photography 65.5
Photo 65.5
Transportation 56.2
Glasses 56
Accessories 56
Accessory 56

Clarifai
created on 2023-10-29

people 99.9
adult 99.1
gown (clothing) 98.7
woman 97.6
portrait 96.7
wear 96.6
one 96.5
two 93.1
actress 92.9
facial expression 90.5
man 86.9
administration 86.8
sit 85
kimono 82.8
dress 81.7
print 78.9
furniture 77.3
actor 76.7
leader 75.9
musician 74.9

Imagga
created on 2022-02-12

statue 36
man 18.8
architecture 18
person 17.6
adult 16.9
people 16.7
sculpture 16.7
building 16.3
historical 15
dress 14.5
male 14.3
jacket 13.5
history 13.4
monument 13.1
landmark 12.6
portrait 12.3
clothing 12.3
detail 12.1
world 11.9
old 11.8
city 11.6
black 11.5
face 11.4
travel 11.3
art 11.2
book jacket 10.8
religion 10.8
marble 10.8
tourism 10.7
culture 10.3
historic 10.1
attractive 9.8
stone 9.4
famous 9.3
street 9.2
tourist 8.6
exterior 8.3
fashion 8.3
human 8.2
symbol 8.1
structure 8
business 7.9
couple 7.8
wall 7.8
covering 7.8
ancient 7.8
men 7.7
god 7.7
newspaper 7.6
clothes 7.5
outdoors 7.5
holding 7.4
alone 7.3
decoration 7.2
looking 7.2
memorial 7.2
cool 7.1
love 7.1

Google
created on 2022-02-12

Microsoft
created on 2022-02-12

text 96.9
outdoor 92.9
clothing 91.5
person 89.1
smile 88.5
human face 88.3
black and white 66.7
dress 61.2
woman 58.5

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 73.3%
Surprised 58.8%
Calm 30.6%
Happy 7.6%
Angry 0.9%
Confused 0.9%
Disgusted 0.6%
Sad 0.5%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 97.3%
Person 81%

Text analysis

Amazon

AS