Human Generated Data

Title

Untitled (portrait of four women in living room)

Date

1974

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19789

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Chair 100
Furniture 100
Clothing 99.4
Apparel 99.4
Person 99.2
Human 99.2
Person 98.8
Person 98.5
Person 98.4
Couch 95.3
Shorts 93.8
Living Room 85.7
Indoors 85.7
Room 85.7
Face 85.3
People 80
Monitor 73.3
Electronics 73.3
Display 73.3
Screen 73.3
Pants 70.4
Portrait 69.2
Photography 69.2
Photo 69.2
Kid 64
Child 64
Female 63.8
Shoe 61.9
Footwear 61.9
Suit 60.4
Coat 60.4
Overcoat 60.4
Floor 60.3
Flooring 60.1
Sitting 58.1
Boy 57.5
Girl 57

Clarifai
created on 2023-10-22

people 99.9
group 98.6
adult 97.7
woman 97.2
man 95.8
group together 93.8
child 93.4
wear 92.7
two 92.5
chair 90.2
sitting 90
outfit 89
sit 88.7
three 88.5
seat 87.8
actress 86.3
education 85.9
furniture 85.9
veil 84.2
actor 83.3

Imagga
created on 2022-03-05

kin 67.6
man 31.5
teacher 26.2
male 24.1
people 23.4
educator 19.9
couple 19.1
person 18.9
professional 18.7
adult 14.9
happy 13.8
sport 12.4
lifestyle 12.3
play 12
holding 11.5
men 11.1
women 11.1
park 10.7
happiness 10.2
smiling 10.1
leisure 10
family 9.8
businessman 9.7
outdoors 9.7
style 9.6
black 9.6
love 9.5
two 9.3
portrait 9
sitting 8.6
youth 8.5
business 8.5
room 8.5
active 8.4
outdoor 8.4
summer 8.3
dark 8.3
fashion 8.3
success 8
interior 8
together 7.9
day 7.8
player 7.8
boy 7.8
grandfather 7.8
color 7.8
dance 7.7
child 7.7
wife 7.6
power 7.5
joy 7.5
friends 7.5
fun 7.5
future 7.4
vintage 7.4
crutch 7.3
playing 7.3
team 7.2
kid 7.1
sky 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

drawing 80.7
clothing 79.1
person 74.4
cartoon 72.9
text 65.7
black and white 64.7
footwear 55.1
sketch 51.7

Color Analysis

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.3%
Happy 75.1%
Sad 21.1%
Fear 1.7%
Surprised 0.5%
Confused 0.5%
Calm 0.5%
Disgusted 0.3%
Angry 0.3%

AWS Rekognition

Age 33-41
Gender Female, 58.4%
Happy 86.2%
Disgusted 4.2%
Surprised 3.6%
Angry 2%
Sad 1.6%
Fear 0.9%
Confused 0.8%
Calm 0.7%

AWS Rekognition

Age 42-50
Gender Female, 60.5%
Calm 48.9%
Happy 40.1%
Disgusted 3%
Surprised 2.7%
Confused 2.6%
Angry 2.1%
Fear 0.4%
Sad 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.2%
Person 98.8%
Person 98.5%
Person 98.4%