Human Generated Data

Title

Untitled (two women, boy and girl seated on sofa)

Date

1937

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13124

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Face 99.5
Human 99.5
Smile 99.5
Person 99.5
Person 99.4
Person 99.4
Dress 98.9
Clothing 98.9
Apparel 98.9
Person 98.8
Female 98.7
Woman 88.8
Kid 88.6
Child 88.6
Laughing 88.5
Girl 88.1
Teen 86.6
People 86.5
Furniture 85.5
Indoors 83.2
Head 81.9
Portrait 81
Photography 81
Photo 81
Costume 80.4
Couch 74
Room 73.6
Chair 70.7
Living Room 70.5
Door 69
Jar 64.3
Suit 63.8
Coat 63.8
Overcoat 63.8
Boy 63.6
Table 61.5
Vase 60.5
Pottery 60.5
Glasses 59
Accessories 59
Accessory 59
Plant 58.6
Hair 58.1
Housing 57.9
Building 57.9
Man 57.5
Lamp 56.2
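
The tag lists above pair each label with a confidence score (a percentage). A common way to consume such output is to filter on a threshold; the sketch below is illustrative only, using a small hypothetical excerpt of the Amazon tags above rather than any live API call:

```python
# (label, confidence %) pairs excerpted from the Amazon tag list above
tags = [
    ("Face", 99.5), ("Human", 99.5), ("Smile", 99.5),
    ("Furniture", 85.5), ("Couch", 74.0), ("Lamp", 56.2),
]

def confident_tags(pairs, threshold=90.0):
    """Keep only labels at or above the given confidence threshold."""
    return [label for label, score in pairs if score >= threshold]

print(confident_tags(tags))
```

With the 90% threshold shown, only the three near-certain labels survive; lower-confidence guesses like "Couch" and "Lamp" are dropped.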

Clarifai
created on 2023-10-27

people 99.9
child 99.5
group 98.9
portrait 97.8
monochrome 96.2
son 96
wear 95.2
outfit 95.2
nostalgic 95.1
nostalgia 94.6
family 94.4
boy 94.2
three 93.9
adult 93.5
two 93.4
veil 92.7
retro 92.4
art 92.3
music 91.9
girl 91.3

Imagga
created on 2022-01-29

shower cap 59.3
cap 49.4
headdress 38.1
clothing 27.1
man 22.2
person 20.3
people 20.1
world 19.2
portrait 18.1
adult 18.1
sexy 17.7
covering 15.8
attractive 15.4
male 15
happy 14.4
model 14
consumer goods 13.7
face 13.5
hair 13.5
black 13.2
fashion 12.8
human 12
looking 12
head 11.7
dress 11.7
body 11.2
lifestyle 10.8
studio 10.6
pretty 10.5
mask 10
smile 10
sunglasses 9.7
holiday 9.3
hand 9.1
rock 8.7
retro 8.2
danger 8.2
vacation 8.2
lady 8.1
water 8
work 7.8
couple 7.8
happiness 7.8
eyes 7.7
expression 7.7
dance 7.6
style 7.4
glasses 7.4
symbol 7.4
music 7.4
cute 7.2
art 7.2
women 7.1
lovely 7.1
posing 7.1
cool 7.1
love 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.2
clothing 93.3
person 92.8
window 92.3
human face 82.2
posing 77.9
smile 72.8

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Male, 99.5%
Surprised 63.4%
Fear 16.9%
Calm 8.6%
Happy 6.4%
Angry 1.5%
Sad 1.5%
Disgusted 1.5%
Confused 0.2%

AWS Rekognition

Age 25-35
Gender Female, 99.7%
Happy 79.6%
Surprised 16.6%
Fear 1.3%
Calm 0.8%
Angry 0.7%
Disgusted 0.7%
Sad 0.2%
Confused 0.2%

AWS Rekognition

Age 26-36
Gender Female, 69.8%
Happy 73.7%
Calm 18.3%
Surprised 6.8%
Disgusted 0.4%
Fear 0.3%
Angry 0.2%
Sad 0.2%
Confused 0.1%

AWS Rekognition

Age 30-40
Gender Female, 99.8%
Fear 42.2%
Happy 22.1%
Calm 17.7%
Surprised 10.3%
Angry 2.8%
Disgusted 2.1%
Sad 1.6%
Confused 1.1%
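
Each emotion block above is a set of percentages summing to roughly 100. A minimal sketch of reading the dominant emotion from such a block (values copied from the last table above; not an API call):

```python
# Emotion scores (percent) from the last AWS Rekognition block above (Age 30-40)
emotions = {
    "Fear": 42.2, "Happy": 22.1, "Calm": 17.7, "Surprised": 10.3,
    "Angry": 2.8, "Disgusted": 2.1, "Sad": 1.6, "Confused": 1.1,
}

# The dominant emotion is simply the highest-scoring key.
dominant = max(emotions, key=emotions.get)
print(dominant)
```

Note that a dominant score of 42.2% is far from certain; downstream use typically records the full distribution, not just the top label.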

Feature analysis

Amazon

Person
Person 99.5%
Person 99.4%
Person 99.4%
Person 98.8%

Text analysis

Amazon

ع ع ع