Human Generated Data

Title

Untitled (conversation group)

Date

c. 1940

People

Artist: Joseph Woodson Whitesell, American, 1876–1958

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1703

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Person 99.1
Human 99.1
Person 98.8
Shoe 98.8
Clothing 98.8
Footwear 98.8
Apparel 98.8
Sitting 98.5
Person 98.3
Furniture 97.7
Chair 97.7
Person 94.6
Person 93.7
Person 93.3
Person 90.4
Chair 89.6
Chair 86.8
Crowd 82.4
Room 78.2
Indoors 78.2
Waiting Room 77.8
Shoe 66.2
Audience 66.1
Person 65.5
Art 58.5
People 55.7
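
Each machine-generated tag above pairs a label with a confidence score (0–100). A minimal sketch of post-processing such a list by a confidence threshold — the helper and the cutoff are illustrative assumptions, not part of the museum's pipeline:

```python
# Hypothetical sketch: filter machine-generated (label, confidence) tags
# by a minimum confidence, as one might when cleaning the lists above.

def filter_tags(tags, min_confidence=90.0):
    """Return (label, confidence) pairs at or above min_confidence,
    sorted from most to least confident."""
    kept = [t for t in tags if t[1] >= min_confidence]
    return sorted(kept, key=lambda t: t[1], reverse=True)

# A few of the Amazon tags listed above.
amazon_tags = [
    ("Person", 99.1), ("Shoe", 98.8), ("Sitting", 98.5),
    ("Furniture", 97.7), ("Chair", 97.7), ("Crowd", 82.4),
    ("Waiting Room", 77.8), ("Art", 58.5),
]

print(filter_tags(amazon_tags))
```

With the default 90.0 cutoff, lower-confidence guesses such as "Crowd" (82.4) and "Art" (58.5) drop out, leaving only the high-confidence labels.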

Clarifai
created on 2023-10-25

people 100
furniture 99
group 99
room 98
adult 97.9
man 96.9
chair 96.3
group together 96
sit 94.8
administration 93.3
leader 92.7
child 91.7
seat 90.8
several 90
many 89.7
woman 89.7
wear 85.3
meeting 82.5
music 81.9
boy 80.8

Imagga
created on 2022-01-14

man 25.5
people 23.4
person 22.2
room 21.8
chair 18.3
male 15.6
adult 15.5
sitting 13.7
men 13.7
sexy 13.6
women 13.4
old 13.2
musical instrument 11.9
architecture 11.7
history 11.6
classroom 11.3
couple 11.3
attractive 11.2
home 11.2
love 11
portrait 11
model 10.9
family 10.7
interior 10.6
fashion 10.5
body 10.4
office 10.1
indoors 9.7
black 9.6
statue 9.5
hair 9.5
work 9.5
happiness 9.4
lifestyle 9.4
kin 9.3
two 9.3
historic 9.2
business 9.1
pretty 9.1
sculpture 8.9
lady 8.9
posing 8.9
art 8.7
historical 8.5
city 8.3
wind instrument 8.3
vintage 8.3
human 8.2
suit 8.2
sensual 8.2
sensuality 8.2
style 8.1
tourist 8.1
happy 8.1
religion 8.1
stringed instrument 8
ancient 7.8
teacher 7.7
erotic 7.6
elegance 7.5
dark 7.5
tourism 7.4
light 7.3
indoor 7.3
group 7.2
aged 7.2
antique 7.2
smile 7.1
working 7.1

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

person 99.1
furniture 96.8
clothing 93.8
chair 87.9
group 82.9
text 81.4
man 76.4
people 75.7
footwear 68.3
music 60.1
crowd 1.1

Color Analysis

Face analysis

AWS Rekognition

Age 59-67
Gender Female, 100%
Happy 99.6%
Calm 0.3%
Surprised 0%
Angry 0%
Sad 0%
Disgusted 0%
Fear 0%
Confused 0%

AWS Rekognition

Age 59-67
Gender Male, 99.6%
Calm 49.4%
Happy 32.5%
Angry 6.7%
Sad 3.4%
Surprised 3.1%
Disgusted 2%
Fear 1.6%
Confused 1.2%

AWS Rekognition

Age 23-31
Gender Male, 98.4%
Calm 26.9%
Sad 24.6%
Angry 23.5%
Fear 8.4%
Confused 7.2%
Disgusted 3.9%
Happy 3%
Surprised 2.4%

AWS Rekognition

Age 49-57
Gender Male, 100%
Calm 99.8%
Angry 0.1%
Sad 0%
Surprised 0%
Confused 0%
Disgusted 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 50-58
Gender Male, 82.4%
Calm 95.5%
Sad 1.5%
Surprised 1.2%
Disgusted 0.5%
Angry 0.4%
Happy 0.4%
Fear 0.3%
Confused 0.2%

AWS Rekognition

Age 28-38
Gender Male, 62.5%
Calm 50.6%
Fear 23.7%
Sad 11.5%
Disgusted 3.3%
Surprised 3.3%
Angry 3.2%
Happy 2.6%
Confused 1.8%
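
Each AWS Rekognition face record above reports a full emotion distribution whose percentages sum to roughly 100. A small sketch (hypothetical helper, using the first face's values from above) of reducing such a distribution to its dominant emotion:

```python
# Hypothetical sketch: given a Rekognition-style emotion distribution,
# report the single most confident emotion.

def dominant_emotion(emotions):
    """Return the (name, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

# The first AWS Rekognition face listed above.
face1 = {
    "Happy": 99.6, "Calm": 0.3, "Surprised": 0.0, "Angry": 0.0,
    "Sad": 0.0, "Disgusted": 0.0, "Fear": 0.0, "Confused": 0.0,
}
print(dominant_emotion(face1))  # → ('Happy', 99.6)
```

For near-uniform distributions like the third face above (Calm 26.9%, Sad 24.6%, Angry 23.5%), the dominant label is far less meaningful than for a confident one like this.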

Microsoft Cognitive Services

Age 26
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Possible
Blurred Very unlikely
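
Unlike the numeric scores above, Google Vision reports face attributes as likelihood labels. A sketch of mapping those labels to an ordinal scale so they can be compared — the ordering follows Vision's published likelihood enum, but the helper itself is an illustrative assumption:

```python
# Hypothetical sketch: give Google Vision likelihood labels an ordinal
# rank so results like those above can be compared numerically.

LIKELIHOOD_ORDER = [
    "Unknown", "Very unlikely", "Unlikely",
    "Possible", "Likely", "Very likely",
]

def likelihood_rank(label):
    """Return the ordinal position of a likelihood label (higher = more likely)."""
    return LIKELIHOOD_ORDER.index(label)

# The fifth Google Vision face above: Joy "Very likely", Headwear "Possible".
assert likelihood_rank("Very likely") > likelihood_rank("Possible")
```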

Feature analysis

Amazon

Person 99.1%
Shoe 98.8%
Chair 97.7%

Categories