Human Generated Data

Title

Untitled (portrait of four young women in matching dresses seated outside house)

Date

c. 1945

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3417

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Face 99.6
Human 99.6
Person 99.5
Clothing 99.5
Apparel 99.5
Person 99.4
Smile 98.8
Person 98.8
Person 98.2
Female 96.4
Vegetation 93.5
Plant 93.5
Outdoors 90.5
Nature 89.9
Tree 89.5
Woman 87.5
Portrait 85.2
Photography 85.2
Photo 85.2
Shelter 83.4
Countryside 83.4
Rural 83.4
Building 83.4
Man 78.8
Housing 77.6
People 76.2
Chair 74.1
Furniture 74.1
Grass 73.2
Girl 73.1
Street 69.5
Road 69.5
Urban 69.5
City 69.5
Town 69.5
Dress 67.8
Land 66.9
Laughing 66.6
Kid 65
Child 65
Door 63
Yard 60.3
Teen 60.1
Selfie 60
House 58.9
Skin 58.4
Alley 57.2
Alleyway 57.2
Beard 56
Villa 55.8
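
The Amazon tag list above has the shape of label-detection output (a label name plus a confidence score). The following is a minimal sketch of how such a list might be retrieved with the boto3 Rekognition client; the local filename, MaxLabels, and MinConfidence values are illustrative assumptions and not part of this record.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("annas_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=60,        # the list above holds several dozen labels
    MinConfidence=55.0,  # the lowest confidence shown above is in the mid-50s
)

# Print "Label confidence" pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')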

Clarifai
created on 2023-10-26

people 99.9
monochrome 99.7
portrait 98.8
adult 98.4
man 98.2
two 98
group 97.8
woman 97.4
three 97.1
family 96.1
wear 96
child 95.7
street 93.7
group together 92.3
veil 91.6
four 90.9
outfit 90.8
nostalgia 90.6
documentary 90.5
interaction 90.4

Imagga
created on 2022-01-22

people 25.1
person 21.2
man 20.8
male 20.7
adult 18.2
mask 16
black 14.6
portrait 14.2
dress 11.7
soldier 11.7
clothing 11.1
military 10.6
attractive 10.5
covering 10
protection 10
danger 10
face 9.9
vintage 9.9
religion 9.8
human 9.7
model 9.3
hand 9.1
art 9.1
world 8.9
statue 8.7
love 8.7
outdoor 8.4
power 8.4
old 8.4
dark 8.3
kin 8.3
uniform 8.3
jacket 8.3
one 8.2
outdoors 8.2
posing 8
happiness 7.8
men 7.7
war 7.7
culture 7.7
fashion 7.5
happy 7.5
leisure 7.5
sport 7.4
pose 7.2
sexy 7.2
lifestyle 7.2
negative 7.2

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.8
clothing 97.1
posing 97
window 96.9
person 96.2
human face 92.2
man 91
smile 88
black 80.2
old 52.8
image 40.5

Color Analysis

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 58.9%
Happy 75.5%
Calm 21.5%
Surprised 1.9%
Disgusted 0.3%
Sad 0.3%
Fear 0.2%
Angry 0.2%
Confused 0.1%

AWS Rekognition

Age 33-41
Gender Female, 77%
Happy 98.1%
Calm 1%
Surprised 0.5%
Angry 0.1%
Sad 0.1%
Fear 0.1%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 24-34
Gender Female, 71.5%
Happy 50.7%
Surprised 46.8%
Fear 1.4%
Calm 0.6%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%
Sad 0.1%

AWS Rekognition

Age 23-33
Gender Female, 97.2%
Calm 98.4%
Happy 1.2%
Angry 0.1%
Surprised 0.1%
Sad 0.1%
Disgusted 0.1%
Confused 0%
Fear 0%
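
The four blocks above (age range, gender with confidence, and a ranked list of emotion confidences, one block per detected face) follow the shape of Amazon Rekognition face-analysis output. A minimal sketch with boto3 is below, reusing the same hypothetical local image file; the request parameters are assumptions for illustration.

import boto3

rekognition = boto3.client("rekognition")

with open("annas_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort highest-first to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')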

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
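
The Google Vision blocks above report per-face likelihood ratings rather than numeric scores. A minimal sketch with the google-cloud-vision client library is below, again assuming a hypothetical local image file and configured application credentials.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("annas_portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values are small integers; map them to the wording used above.
likelihood_name = ("Unknown", "Very unlikely", "Unlikely",
                   "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihood_name[face.surprise_likelihood])
    print("Anger", likelihood_name[face.anger_likelihood])
    print("Sorrow", likelihood_name[face.sorrow_likelihood])
    print("Joy", likelihood_name[face.joy_likelihood])
    print("Headwear", likelihood_name[face.headwear_likelihood])
    print("Blurred", likelihood_name[face.blurred_likelihood])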

Feature analysis

Amazon

Person 99.5%

Categories